New Tech Could Record Deep-Brain Activity From Surface
Modern technology for recording deep-brain activity involves sharp metal electrodes that penetrate the tissue, causing damage that can compromise the signal and limiting how often they can be used.
A rapidly growing area in materials science and engineering aims to fix the problem by designing electrodes that are softer, smaller, and flexible — safer for use inside the delicate tissues of the brain. On January 17, researchers from the University of California, San Diego, reported the development of a thin, flexible electrode that can be inserted deep within the brain and communicate with sensors on the surface.
But what if you could record detailed deep-brain activity without piercing the brain?
A team of researchers (as it happens, also from UC San Diego) has developed a thin, flexible implant that “resides on the brain’s surface” and “can infer neural activity from deeper layers,” said Duygu Kuzum, PhD, a professor of electrical and computer engineering, who led the research.
By combining electrical and optical imaging methods, and artificial intelligence, the researchers used the device — a polymer strip packed with graphene electrodes — to predict deep calcium activity from surface signals, according to a proof-of-concept study published this month in Nature Nanotechnology.
“Almost everything we know about how neurons behave in living brains comes from data collected with either electrophysiology or two-photon imaging,” said neuroscientist Joshua H. Siegle, PhD, of the Allen Institute for Neural Dynamics in Seattle, who was not involved in the study. “Until now, these two methods have rarely been used simultaneously.”
The technology, which has been tested in mice, could help advance our knowledge of how the brain works and may lead to new minimally invasive treatments for neurologic disorders.
Multimodal Neurotech: The Power of 2-in-1
Electrical and optical methods for recording brain activity have been crucial in advancing neurophysiologic science, but each technique has its limits. Electrical recordings provide high “temporal resolution”; they reveal when activation is happening, but not really where. Optical imaging, on the other hand, offers high “spatial resolution,” showing which area of the brain is lighting up, but its measurements may not correspond with the activity’s timing.
Research over the past decade has explored how to combine and harness the strengths of both methods. One potential solution is to use electrodes made of transparent materials such as graphene, allowing a clear field of view for a microscope during imaging. Recently, University of Pennsylvania scientists used graphene electrodes to illuminate the neural dynamics of seizures.
But there are challenges. If graphene electrodes are very small — in this case, 20 µm in diameter — they become more resistant to the flow of electricity. Dr. Kuzum and colleagues addressed this by adding tiny platinum particles to improve electrical conductivity. Long graphene wires connect electrodes to the circuit board, but defects in graphene can interrupt the signal, so they made each wire with two layers; any defects in one wire could be hidden by the other.
By combining the two methods (microelectrode arrays and two-photon imaging), the researchers could see both when brain activity was happening and where, including in deeper layers. They discovered a correlation between electrical responses on the surface and cellular calcium activity deeper down. The team used these data to create a neural network (a type of artificial intelligence that learns to recognize patterns) that predicts deep calcium activity from surface-level readings.
The tech could help scientists study brain activity “in a way not possible with current single-function tools,” said Luyao Lu, PhD, professor of biomedical engineering at George Washington University in Washington, DC, who was not involved in the study. It could shed light on interactions between vascular and electrical activity, or explain how place cells (neurons in the hippocampus) are so efficient at creating spatial memory.
It could also pave the way for minimally invasive neural prosthetics or targeted treatments for neurologic disorders, the researchers say. Implanting the device would be a “straightforward process” similar to placing electrocorticography grids in patients with epilepsy, said Dr. Kuzum.
But first, the team plans to do more studies in animal models before testing the tech in clinical settings, Dr. Kuzum added.
A version of this article appeared on Medscape.com.
FROM NATURE NANOTECHNOLOGY
Colchicine May Benefit Patients With Diabetes and Recent MI
TOPLINE:
A daily low dose of colchicine significantly reduces ischemic cardiovascular events in patients with type 2 diabetes (T2D) and a recent myocardial infarction (MI).
METHODOLOGY:
- After an MI, patients with vs without T2D have a higher risk for another cardiovascular event.
- The Colchicine Cardiovascular Outcomes Trial (COLCOT), a randomized, double-blinded trial, found a lower risk for ischemic cardiovascular events with 0.5 mg colchicine taken daily vs placebo, initiated within 30 days of an MI.
- Researchers conducted a prespecified subgroup analysis of 959 adult patients with T2D (mean age, 62.4 years; 22.2% women) in COLCOT (462 patients in colchicine and 497 patients in placebo groups).
- The primary efficacy endpoint was a composite of cardiovascular death, resuscitated cardiac arrest, MI, stroke, or urgent hospitalization for angina requiring coronary revascularization within a median 23 months.
- The patients were taking a variety of appropriate medications, including aspirin and another antiplatelet agent and a statin (98%-99%) and metformin (75%-76%).
TAKEAWAY:
- The risk for the primary endpoint was 35% lower in patients with T2D who received colchicine than in those who received placebo (hazard ratio, 0.65; P = .03).
- The primary endpoint event rate per 100 patient-months was significantly lower in the colchicine group than in the placebo group (rate ratio, 0.53; P = .01).
- The frequencies of adverse events were similar in both the treatment and placebo groups (14.6% and 12.8%, respectively; P = .41), with gastrointestinal adverse events being the most common.
- In COLCOT, patients with T2D had a 1.86-fold higher risk for a primary endpoint cardiovascular event, but there was no significant difference in the primary endpoint between those with and without T2D on colchicine.
IN PRACTICE:
“Patients with both T2D and a recent MI derive a large benefit from inflammation-reducing therapy with colchicine,” the authors noted.
SOURCE:
This study, led by François Roubille, University Hospital of Montpellier, France, was published online on January 5, 2024, in Diabetes Care.
LIMITATIONS:
Patients were not stratified at inclusion for the presence of diabetes. Also, the study did not evaluate the role of glycated hemoglobin and low-density lipoprotein cholesterol, as well as the effects of different glucose-lowering medications or possible hypoglycemic episodes.
DISCLOSURES:
The COLCOT study was funded by the Government of Quebec, the Canadian Institutes of Health Research, and philanthropic foundations. Coauthors Jean-Claude Tardif and Wolfgang Koenig declared receiving research grants, honoraria, advisory board fees, and lecture fees from pharmaceutical companies, as well as having other ties with various sources.
A version of this article appeared on Medscape.com.
Adequate Transition of Epilepsy Care from Pediatric to Adult Is Often Lacking
Adequate transition of epilepsy care from pediatric to adult providers is often lacking, according to a recent survey. Many respondents received little to no information regarding the process, and many adults were still receiving care from family physicians or pediatric neurologists. The study was published online in Epilepsy & Behavior.
Room for Improvement
“We are not doing as good a job with planning for transition as we should,” said Elaine C. Wirrell, MD, who was not involved with the study. “It is not just a simple issue of sending your patient to an adult neurologist. Transition is a process that happens over time, so we need to do a better job getting our families ready for moving on to an adult provider.” Dr. Wirrell is director of pediatric epilepsy and professor of neurology at the Mayo Clinic in Rochester, Minnesota.
Clumsy Transitions
Investigators distributed a 25-question survey to patients and caregivers who attended the 2019 Epilepsy Awareness Day at Disneyland, and through online support groups in North America. Among 58 responses, 32 came from patients between ages 12 and 17 years or their caregivers.
Despite attempts to recruit a diverse cross-section of respondents, most patients had severe epilepsy and comorbidities: 43% had daily or weekly seizures; 45% were on three or more antiseizure medications; and 74% had intellectual disabilities.
Many children with early-life epilepsies suffer from developmental and epileptic encephalopathy, which has associated non-seizure symptoms including learning challenges, behavioral issues, and other medical concerns, Dr. Wirrell said. Therefore, she said, finding a neurologist who treats adults — and has the expertise and interest to care for such patients — can be difficult.
“We’re seeing many patients not making that transition, or maybe not making it appropriately, so they’re not necessarily getting to the providers who have the most expertise in managing their epilepsy.” Among adults surveyed, 27% were still being followed by pediatric neurologists, and 35% were visiting family doctors for epilepsy-related treatment.
Because the needs of children with complex epilepsy can extend well beyond neurology, Dr. Wirrell added, managing such cases often requires multidisciplinary pediatric teams. “Finding that team on the adult side is more challenging.” As a result, she said, patients may transfer their neurology care without getting additional support for comorbidities such as mood disorders and learning disabilities.
The foregoing challenges are complicated by the fact that pediatric neurologists often lack the time (and in the United States, reimbursement) to adequately address the transition process, said Dr. Wirrell. Providers in freestanding children’s hospitals may face additional challenges coordinating with adult-care providers outside their facilities, she said.
“There’s also potentially a reluctance of both families and physicians to transition the patient on, because there’s concern that maybe there isn’t anybody on the adult side who is able to do as good a job as what they have on the pediatric side.”
Well-Coordinated Transitions Should Have No Surprises
Transition should be a planned, independence-promoting process that results in smooth, well-coordinated movement of pediatric patients into adult care — one without surprises or disconnections, the authors wrote. However, 55% of respondents never heard the term “transition” from any provider, even though 69% of patients were being treated in academic specialty centers.
Among 12- to 17-year-olds, 72% had never discussed transition with their healthcare team. That figure includes no 17-year-olds. Approximately 90% of respondents said they received sufficient time during healthcare visits, but 54% reported feeling stressed when moving from pediatric to adult care.
Given resource constraints in many pediatric epilepsy programs, the study authors recommended patient-empowerment tools such as a transition toolkit to help patients and families navigate the transition process even in places without formal transition programs.
“Many of these children are coming over with boatloads of medical records,” Dr. Wirrell said. “It’s not fair to the adult provider, who then has to go through all those records.” Instead, she said, pediatric teams should provide succinct summaries of relevant test results, medication side effects, prior treatments tried, and the like. “Those summaries are critically important so that we can get information to the person who needs it.”
Although successful transition requires significant coordination, she added, much of the process can often be handled by nonphysicians. “There are some very good nurse-led transition programs. Often, we can have a nurse providing education to the family and even potentially having a joint visit with an adult epilepsy nurse for complex patients.”
Pediatric providers also must know when to begin the transition process, Dr. Wirrell said. As soon as patients are 13 or 14 years old, she suggested discussing the process with them and their families every 6 to 12 months, covering specifics ranging from how to order medications to why adult patients may need power of attorney designees.
On a broader scale, said Dr. Wirrell, a smooth handoff requires planning. Fortunately, she said, the topic is becoming a significant priority for a growing number of children’s hospitals, not only for epilepsy but also for other chronic illnesses.
Dr. Wirrell is co–editor-in-chief for epilepsy.com. She reports no relevant financial interests.
study was published online in Epilepsy & Behavior.
, according to a recent survey. Many respondents received little to no information regarding the process, and many adults were still receiving care from family physicians or pediatric neurologists. TheRoom for Improvement
“We are not doing as good a job with planning for transition as we should,” said Elaine C. Wirrell, MD, who was not involved with the study. “It is not just a simple issue of sending your patient to an adult neurologist. Transition is a process that happens over time, so we need to do a better job getting our families ready for moving on to an adult provider.” Dr. Wirrell is director of pediatric epilepsy and professor of neurology at the Mayo Clinic in Rochester, Minnesota.
Clumsy Transitions
Investigators distributed a 25-question survey to patients and caregivers who attended the 2019 Epilepsy Awareness Day at Disneyland, and through online support groups in North America. Among 58 responses, 32 came from patients between ages 12 and 17 years or their caregivers.
Despite attempts to recruit a diverse cross-section of respondents, most patients had severe epilepsy and comorbidities: 43% had daily or weekly seizures; 45% were on three or more antiseizure medications; and 74% had intellectual disabilities.
Many children with early-life epilepsies suffer from developmental and epileptic encephalopathy, which has associated non-seizure symptoms including learning challenges, behavioral issues, and other medical concerns, Dr. Wirrell said. Therefore, she said, finding a neurologist who treats adults — and has the expertise and interest to care for such patients — can be difficult.
“We’re seeing many patients not making that transition, or maybe not making it appropriately, so they’re not necessarily getting to the providers who have the most expertise in managing their epilepsy.” Among adults surveyed, 27% were still being followed by pediatric neurologists, and 35% were visiting family doctors for epilepsy-related treatment.
Because the needs of children with complex epilepsy can extend well beyond neurology, Dr. Wirrell added, managing such cases often requires multidisciplinary pediatric teams. “Finding that team on the adult side is more challenging.” As a result, she said, patients may transfer their neurology care without getting additional support for comorbidities such as mood disorders and learning disabilities.
, according to a recent survey. Many respondents received little to no information regarding the process, and many adults were still receiving care from family physicians or pediatric neurologists. The study was published online in Epilepsy & Behavior.
Room for Improvement
“We are not doing as good a job with planning for transition as we should,” said Elaine C. Wirrell, MD, who was not involved with the study. “It is not just a simple issue of sending your patient to an adult neurologist. Transition is a process that happens over time, so we need to do a better job getting our families ready for moving on to an adult provider.” Dr. Wirrell is director of pediatric epilepsy and professor of neurology at the Mayo Clinic in Rochester, Minnesota.
Clumsy Transitions
Investigators distributed a 25-question survey to patients and caregivers who attended the 2019 Epilepsy Awareness Day at Disneyland, and through online support groups in North America. Among 58 responses, 32 came from patients between ages 12 and 17 years or their caregivers.
Despite attempts to recruit a diverse cross-section of respondents, most patients had severe epilepsy and comorbidities: 43% had daily or weekly seizures; 45% were on three or more antiseizure medications; and 74% had intellectual disabilities.
Many children with early-life epilepsies suffer from developmental and epileptic encephalopathy, which has associated non-seizure symptoms including learning challenges, behavioral issues, and other medical concerns, Dr. Wirrell said. Therefore, she said, finding a neurologist who treats adults — and has the expertise and interest to care for such patients — can be difficult.
“We’re seeing many patients not making that transition, or maybe not making it appropriately, so they’re not necessarily getting to the providers who have the most expertise in managing their epilepsy.” Among adults surveyed, 27% were still being followed by pediatric neurologists, and 35% were visiting family doctors for epilepsy-related treatment.
Because the needs of children with complex epilepsy can extend well beyond neurology, Dr. Wirrell added, managing such cases often requires multidisciplinary pediatric teams. “Finding that team on the adult side is more challenging.” As a result, she said, patients may transfer their neurology care without getting additional support for comorbidities such as mood disorders and learning disabilities.
The foregoing challenges are complicated by the fact that pediatric neurologists often lack the time (and in the United States, reimbursement) to adequately address the transition process, said Dr. Wirrell. Providers in freestanding children’s hospitals may face additional challenges coordinating with adult-care providers outside their facilities, she said.
“There’s also potentially a reluctance of both families and physicians to transition the patient on, because there’s concern that maybe there isn’t anybody on the adult side who is able to do as good a job as what they have on the pediatric side.”
Well-Coordinated Transitions Should Have No Surprises
Transition should be a planned, independence-promoting process that results in smooth, well-coordinated movement of pediatric patients into adult care — one without surprises or disconnections, the authors wrote. However, 55% of respondents never heard the term “transition” from any provider, even though 69% of patients were being treated in academic specialty centers.
Among 12- to 17-year-olds, 72% had never discussed transition with their healthcare team; none of the 17-year-olds had done so. Approximately 90% of respondents said they received sufficient time during healthcare visits, but 54% reported feeling stressed when moving from pediatric to adult care.
Given resource constraints in many pediatric epilepsy programs, the study authors recommended patient-empowerment tools such as a transition toolkit to help patients and families navigate the transition process even in places without formal transition programs.
“Many of these children are coming over with boatloads of medical records,” Dr. Wirrell said. “It’s not fair to the adult provider, who then has to go through all those records.” Instead, she said, pediatric teams should provide succinct summaries of relevant test results, medication side effects, prior treatments tried, and the like. “Those summaries are critically important so that we can get information to the person who needs it.”
Although successful transition requires significant coordination, she added, much of the process can often be handled by nonphysicians. “There are some very good nurse-led transition programs. Often, we can have a nurse providing education to the family and even potentially having a joint visit with an adult epilepsy nurse for complex patients.”
Pediatric providers also must know when to begin the transition process, Dr. Wirrell said. As soon as patients are 13 or 14 years old, she suggested discussing the process with them and their families every 6 to 12 months, covering specifics ranging from how to order medications to why adult patients may need power of attorney designees.
On a broader scale, said Dr. Wirrell, a smooth handoff requires planning. Fortunately, she said, the topic is becoming a significant priority for a growing number of children’s hospitals, not only for epilepsy but also for other chronic illnesses.
Dr. Wirrell is co–editor-in-chief for epilepsy.com. She reports no relevant financial interests.
FROM EPILEPSY & BEHAVIOR
Chronic Fatigue Syndrome and Fibromyalgia: A Single Disease Entity?
Myalgic encephalomyelitis/chronic fatigue syndrome (ME/CFS) and fibromyalgia (FM) have overlapping neurologic symptoms — particularly profound fatigue. The similarity between these two conditions has led to the question of whether they are indeed distinct central nervous system (CNS) entities, or whether they exist along a spectrum and are actually two different manifestations of the same disease process.
A new study utilized a novel methodology — unbiased quantitative mass spectrometry-based proteomics — to investigate this question by analyzing cerebrospinal fluid (CSF) in a group of patients with ME/CFS and another group of patients diagnosed with both ME/CFS and FM.
Close to 2,100 proteins were identified, of which nearly 1,800 were common to both conditions.
“ME/CFS and fibromyalgia do not appear to be distinct entities, with respect to their cerebrospinal fluid proteins,” lead author Steven Schutzer, MD, professor of medicine, Rutgers New Jersey School of Medicine, told this news organization.
“Work is underway to solve the multiple mysteries of ME/CFS, fibromyalgia, and other neurologic-associated diseases,” he continued. “We have further affirmed that we have a precise objective discovery tool in our hands. Collectively studying multiple diseases brings clarity to each individual disease.”
The study was published in the December 2023 issue of Annals of Medicine.
Cutting-Edge Technology
“ME/CFS is characterized by disabling fatigue, and FM is an illness characterized by body-wide pain,” Dr. Schutzer said. These “medically unexplained” illnesses often coexist by current definitions, and the overlap between them has suggested that they may be part of the “same illness spectrum.”
But co-investigator Benjamin Natelson, MD, professor of neurology and director of the Pain and Fatigue Study Center, Mount Sinai, New York, and others found in previous research that there are distinct differences between the conditions, raising the possibility that there may be different pathophysiological processes.
“The physicians and scientists on our team have had longstanding interest in studying neurologic diseases with cutting-edge tools such as mass spectrometry applied to CSF,” Dr. Schutzer said. “We have had success using this method to distinguish diseases such as ME/CFS from post-treatment Lyme disease, multiple sclerosis, and healthy normal people.”
Dr. Schutzer explained that Dr. Natelson had acquired CSF samples from “well-characterized [ME/CFS] patients and controls.”
Since the cause of ME/CFS is “unknown,” it seemed “ripe to investigate it further with the discovery tool of mass spectrometry” by harnessing the “most advanced equipment in the country at the Pacific Northwest National Laboratory, which is part of the US Department of Energy.”
Dr. Schutzer noted that it was the “merger of different clinical and laboratory expertise” that enabled them to address whether ME/CFS and FM are two distinct disease processes.
The rationale for analyzing CSF is that it is the fluid closest to the brain, he added. “A lot of people have studied ME/CFS peripherally because they don’t have access to spinal fluid or it’s easier to look peripherally in the blood, but that doesn’t mean that the blood is where the real ‘action’ is occurring.”
The researchers compared the CSF of 15 patients with ME/CFS only to 15 patients with ME/CFS+FM using mass spectrometry-based proteomics, which they had employed in previous research to see whether ME/CFS was distinct from persistent neurologic Lyme disease syndrome.
This technology has become the “method of choice and discovery tool to rapidly uncover protein biomarkers that can distinguish one disease from another,” the authors stated.
In particular, in unbiased quantitative mass spectrometry-based proteomics, the researchers do not have to know in advance what’s in a sample before studying it, Dr. Schutzer explained.
Shared Pathophysiology?
Both groups of patients were of similar age (41.3 ± 9.4 years and 40.1 ± 11.0 years, respectively), with no differences in gender or rates of current comorbid psychiatric diagnoses between the groups.
The researchers quantified a total of 2,083 proteins, including 1,789 that were specifically quantified in all of the CSF samples, regardless of the presence or absence of FM.
Several analyses (including an ANOVA with adjusted P values, a Random Forest machine learning approach that examined relative protein abundance changes between the ME/CFS and ME/CFS+FM groups, and unsupervised hierarchical clustering) found no distinguishing differences between the groups, the authors stated.
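The analytic pipeline described above can be sketched, very loosely, in stdlib-only Python. This is an illustration, not the authors' actual code: the synthetic log-abundances, group sizes of 15 (matching the study), and a permutation test standing in for the ANOVA are all invented for demonstration, and the Random Forest step is omitted. The Benjamini–Hochberg step shows one common way "adjusted P values" are obtained when thousands of proteins are tested at once.

```python
import random
import statistics

def permutation_test(a, b, n_iter=2000, rng=None):
    """Two-sided permutation test on the difference of group means."""
    rng = rng or random.Random(0)
    observed = abs(statistics.fmean(a) - statistics.fmean(b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        diff = abs(statistics.fmean(pooled[:len(a)]) - statistics.fmean(pooled[len(a):]))
        if diff >= observed:
            hits += 1
    return (hits + 1) / (n_iter + 1)  # add-one smoothing keeps p > 0

def benjamini_hochberg(pvals, alpha=0.05):
    """Return the set of indices rejected at false-discovery rate alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= alpha * rank / m:
            k_max = rank
    return {i for rank, i in enumerate(order, start=1) if rank <= k_max}

rng = random.Random(42)
n_proteins, n_per_group = 50, 15  # 15 vs 15 patients, as in the study
pvals = []
for _ in range(n_proteins):
    base = rng.uniform(5, 10)  # shared log-abundance: no true group effect
    me_cfs = [rng.gauss(base, 1.0) for _ in range(n_per_group)]
    me_cfs_fm = [rng.gauss(base, 1.0) for _ in range(n_per_group)]
    pvals.append(permutation_test(me_cfs, me_cfs_fm, rng=rng))

flagged = benjamini_hochberg(pvals)
print(f"{len(flagged)} of {n_proteins} proteins differ after FDR correction")
```

Because the two synthetic groups are drawn from the same distribution, the correction step should flag few or no proteins, which is the shape of result the study reports for the real CSF data.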
They noted that both conditions are “medically unexplained,” with core symptoms of pain, fatigue, sleep problems, and cognitive difficulty. The fact that these two syndromes coexist so often has led to the assumption that the “similarities between them outweigh the differences,” they wrote.
They pointed to some differences between the conditions reported by others, including an increase in substance P in the CSF of patients with FM but not in those with ME/CFS, as well as some immunological, physiological, and genetic differences.
But if the conclusion that the two illnesses may share a similar pathophysiological basis is supported by other research that includes FM-only patients as comparators to those with ME/CFS, “this would support the notion that the two illnesses fall along a common illness spectrum and may be approached as a single entity — with implications for both diagnosis and the development of new treatment approaches,” they concluded.
‘Noncontributory’ Findings
Commenting on the research, Robert G. Lahita, MD, PhD, director of the Institute for Autoimmune and Rheumatic Diseases, St. Joseph Health, Wayne, New Jersey, stated that he does not regard these diseases as neurologic but rather as rheumatologic.
“Most neurologists don’t see these diseases, but as a rheumatologist, I see them every day,” said Dr. Lahita, professor of medicine at Hackensack (New Jersey) Meridian School of Medicine and a clinical professor of medicine at Rutgers New Jersey Medical School, New Brunswick. “ME/CFS isn’t as common in my practice, but we do deal with many post-COVID patients who are afflicted mostly with ME/CFS.”
He noted that an important reason for fatigue in FM is that patients generally don’t sleep, or their sleep is disrupted. This is different from the cause of fatigue in ME/CFS.
In addition, the small sample size and the lack of difference between males and females were both limitations of the current study, said Dr. Lahita, who was not involved in this research. “We know that FM disproportionately affects women — in my practice, for example, over 95% of the patients with FM are female — while ME/CFS affects both genders similarly.”
Using proteomics as a biomarker was also problematic, according to Dr. Lahita. “It would have been more valuable to investigate differences in cytokines, for example,” he suggested.
Ultimately, Dr. Lahita thinks that the study is “non-contributory to the field and, as complex as the analysis was, it does nothing to differentiate the two conditions or explain the syndromes themselves.”
He added that it would have been more valuable to compare ME/CFS not only to ME/CFS plus FM but also with FM without ME/CFS and to healthy controls, and perhaps to a group with an autoimmune condition, such as lupus or Hashimoto’s thyroiditis.
Dr. Schutzer acknowledged that a limitation of the current study is that his team was unable to analyze the CSF of patients with only FM. He and his colleagues “combed the world’s labs” for existing CSF samples of patients with FM alone but were unable to obtain any. “We see this study as a ‘stepping stone’ and hope that future studies will include patients with FM who are willing to donate CSF samples that we can use for comparison,” he said.
The authors received support from the National Institutes of Health, National Institute of Allergy and Infectious Diseases, and National Institute of Neurological Disorders and Stroke. Dr. Schutzer, coauthors, and Dr. Lahita reported no relevant financial relationships.
Retinal Perfusion Is Reduced During Migraine Attacks
Together, these changes could one day represent migraine biomarkers, authors say. The study was published online in Headache.
“We’re always looking for a biological marker for migraine,” said Alan M. Rapoport, MD, a clinical professor of neurology in the David Geffen School of Medicine at the University of California Los Angeles and past president of the International Headache Society. Researchers have identified many parameters that make people more likely to experience migraine, he said, but there remains no smoking gun. “We do not yet have a diagnostic test.”
Investigators have long been examining ocular vascular supply, added Dr. Rapoport, who was not involved with the study, because the eyes, visual system of the brain, and migraine are closely related. “But no one has ever figured out that one could use anything related to the eye as a definitive diagnostic test. This study was interesting because researchers used a very advanced technique to see if there are changes in the vascular supply to the eyeball during migraine.”
During Attacks
Study investigators prospectively enrolled 37 patients diagnosed with migraine with aura (MA), 30 with migraine without aura (MO), and 20 healthy controls. All subjects underwent macular OCTA for interictal analysis. A total of 20 patients with migraine (12 with MA and 8 with MO) underwent repeat scans during migraine attacks, and 5 control patients had repeat scans.
Compared with interictal measurements, significant parafoveal reductions in vessel flux index, an indicator of retinal perfusion, occurred in both the MA and MO groups during migraine attacks: –7% (95% CI, –10% to –4%; P = .006) and –7% (95% CI, –10% to –3%; P = .016), respectively, versus controls (2%, 95% CI, –3% to 7%).
The fact that migraine attacks resulted in reduced blood supply to the retinal region responsible for central vision is intriguing, said Dr. Rapoport, because sufficient reductions in blood supply there could result in blurred vision or other visual difficulties that might be mistaken for a true aura. “Many patients describe blurred vision related to their migraine headaches which do not usually qualify for an aura diagnosis,” he said.
Diagnostic criteria for MA, which afflicts around one third of people with migraine, include visual aberrations lasting at least 5 minutes and no more than 60 minutes. Visual aberrations average about 20-25 minutes, said Dr. Rapoport. “And we don’t usually accept blurred vision.” For most people who experience ictal blurred vision, he added, the phenomenon only lasts a short time and is not considered an aura.
More typical visual manifestations of MA include zigzag lines in an overall crescent shape that may blink, have bright edges, grow and shrink in size, and/or move across the visual field; patients also may have blind spots or distortions (e.g., faraway vision, smaller or larger vision, or kaleidoscopic fractured vision). Nevertheless, said Dr. Rapoport, the study may shed light on why some people experiencing a migraine attack may suffer a brief bout of blurred vision and mistakenly report experiencing an aura.
Between Attacks
Comparing the two migraine groups interictally showed statistically significant differences in macular structure and function. Compared with the MO cohort, the MA cohort had higher circularity (mean [SD], 0.686 [0.088] vs 0.629 [0.120]; P = .004), as well as a 13% (SD ± 10%; P = .003) lower foveal vessel flux index. “Not only is perfusion lower in both types of migraine during the attack,” said Dr. Rapoport, “but between attacks, people with MA had a lower blood supply to the retina than those who had MO.”
Unilateral Migraine
In a subset of patients (14 with MA and 12 with MO) whose headaches occurred unilaterally, investigators found retinal vascular parameters consistent with greater perfusion in the ipsilateral eye versus the contralateral eye. The significance of these findings remains unclear, Dr. Rapoport said, because circulatory findings revealed by CAT or MRI scans of patients with unilateral headaches are often normal or involve complex changes or mild edema on the side of the headache. The visual cortex on either side receives input from both eyes, he added.
Study Limitations
Authors acknowledged several study shortcomings. Most notably, COVID-19 restrictions resulted in a small sample size, and several patients (excluded from analysis) failed to return for repeat scans during migraine attacks. The study included patients with migraine attacks of varying frequency, and a handful of patients used acute rescue medications before undergoing ictal scans.
“If a future study corrected all these shortcomings,” Dr. Rapoport said, “the results might be more impressive and even more significant.” Based on these results alone, he said, it would be premature to pronounce OCTA-derived measurements of retinal perfusion and related parameters as future migraine biomarkers.
“But it’s a good start. If this hasn’t been done before, in quite this way, this is a very interesting study which, when repeated, should lead to even more significant findings.”
For now, the paper should remind practicing neurologists to dig deeper when patients complain of visual problems during migraine attacks. “It might be blurred vision for just 3 minutes,” he said. “Some patients may be calling it an aura, or the doctor may be thinking it is an aura because they’re not digging for further information in the history. We may now have a window into decreased retinal perfusion during a migraine attack and why some patients have blurred vision.”
The study was funded by Amgen and the Baldwin Foundation. Dr. Rapoport is editor-in-chief of Neurology Reviews but reports no relevant relationships with the funders of this research.
FROM HEADACHE
Regular Physical Activity Linked to Larger Brain Volume
TOPLINE:
Regular physical activity is associated with larger brain volumes, new data suggest.
METHODOLOGY:
- The potential neuroprotective effects of regular physical activity on brain structure are unclear despite reported links between physical activity and reduced dementia risk.
- To investigate, researchers analyzed MRI brain scans from 10,125 healthy adults (mean age, 53 years; 52% male) who self-reported their level of physical activity.
- Moderate to vigorous physical activity, defined as activity that increases respiration and pulse rate for at least 10 continuous minutes, was modeled against brain volumes, adjusting for covariates.
- The threshold for defining physically active (vs nonactive) adults was intentionally set at 2.5 days per week, a level far lower than current guidelines.
TAKEAWAY:
- Three quarters of the cohort reported engaging in moderate to vigorous physical activity approximately 4 days per week.
- Physically active adults tended to be younger, included a higher proportion of White individuals, and had lower rates of hypertension and type 2 diabetes.
- After adjusting for multiple factors, increased days of moderate to vigorous activity correlated with larger normalized brain volume in multiple regions including total gray matter; white matter; hippocampus; and frontal, parietal, and occipital lobes.
IN PRACTICE:
“We found that even moderate levels of physical activity, such as taking fewer than 4,000 steps a day, can have a positive effect on brain health. This is much less than the often-suggested 10,000 steps, making it a more achievable goal for many people,” co-author David Merrill, MD, with Pacific Brain Health Center, Santa Monica, California, said in a statement.
SOURCE:
The study, with first author Cyrus A. Raji, MD, PhD, Washington University School of Medicine, St. Louis, was published online in the Journal of Alzheimer’s Disease.
LIMITATIONS:
Participants self-reported physical activity in the past 2 weeks, which does not reflect a lifetime of activity levels. The correlation identified between physical activity and brain volumes may not be solely attributable to physical activity alone.
DISCLOSURES:
The study received funding from several health centers and foundations. Dr. Raji consults for Brainreader ApS, Neurevolution LLC, Apollo Health, Voxelwise Imaging Technology, and Pacific Neuroscience Foundation and is an editorial board member of the Journal of Alzheimer’s Disease but was not involved in the peer-review process.
A version of this article appeared on Medscape.com.
H pylori Infection Linked to Increased Alzheimer’s Risk
TOPLINE:
Clinically apparent Helicobacter pylori infection (CAHPI) is associated with an increased risk for Alzheimer’s disease (AD), results of a large and lengthy population-based study suggest.
METHODOLOGY:
- Researchers identified all cases with a first-time diagnosis of AD and matched each AD case to up to 40 AD-free control cases on the basis of age, sex, cohort entry date, and duration of follow-up.
- The exposure of interest was CAHPI, defined by an algorithm based on clinical guidelines and recommendations for the management of H pylori (HP) infection, with researchers focusing on infected individuals presenting with symptoms or developing serious complications from the infection.
- Researchers performed several sensitivity analyses, which included repeating the primary analysis using alternate lag periods, restricting the cohort to participants with AD (not vascular, alcoholic, and unspecified dementia), and using salmonellosis, an infection not previously associated with AD, as a negative control exposure.
TAKEAWAY:
- Compared with no exposure to CAHPI, exposure to CAHPI was associated with a moderately increased risk for AD (odds ratio [OR], 1.11; 95% CI, 1.01-1.21), with no major effect modification by demographics or socioeconomic status.
- The increased risk peaked 7.3-10.8 years after CAHPI onset (OR, 1.24; 95% CI, 1.05-1.47) before decreasing.
- Sensitivity analyses yielded findings that were overall consistent with those of the primary analysis.
- The analysis with salmonellosis as a negative control exposure showed no association with the risk for AD (OR, 1.03; 95% CI, 0.82-1.29).
IN PRACTICE:
“These results support the notion of HP infection as a potential modifiable risk factor of AD” and “pave the way for future randomized controlled trials that would assess the impact and cost-effectiveness of population-based targeted interventions such as individualized HP eradication programs, on the development of AD,” the authors write.
SOURCE:
The study was conducted by Antonios Douros, Department of Medicine, and Department of Epidemiology, Biostatistics, and Occupational Health, McGill University, Montreal, Quebec, Canada, and colleagues. It was published online in Alzheimer’s & Dementia.
LIMITATIONS:
Given the observational nature of the study, residual confounding is possible. Because exposure was defined on the basis of CAHPI recorded by general practitioners, exposure misclassification due to symptomatic patients not seeking primary care is possible, as is outcome misclassification. The authors can’t rule out the possibility of an association between asymptomatic H pylori infection and AD risk.
DISCLOSURES:
The study received funding from the Canadian Institutes of Health Research. Douros has no relevant conflicts of interest; see paper for disclosures of other authors.
Pauline Anderson has disclosed no relevant financial relationships.
A version of this article appeared on Medscape.com.
Walking Fast May Help Prevent Type 2 Diabetes
Walking is a simple, cost-free form of exercise that benefits physical, social, and mental health in many ways. Several clinical trials have shown that walking regularly is associated with a lower risk for cardiovascular events and all-cause mortality, and having a higher daily step count is linked to a decreased risk for premature death.
Walking and Diabetes
In recent years, the link between walking speed and the risk for multiple health problems has sparked keen interest. Data suggest that a faster walking pace may elicit a greater physiological response and may be associated with more favorable health outcomes than a slow walking pace. A previous meta-analysis of eight cohort studies suggested that individuals in the fastest walking-pace category (median, 5.6 km/h) had a 44% lower risk for stroke than those in the slowest walking-pace category (median, 1.6 km/h), and that the risk for stroke decreased by 13% for every 1-km/h increment in baseline walking pace.
Type 2 diabetes (T2D) is one of the most common metabolic diseases in the world. People with this type of diabetes have an increased risk for microvascular and macrovascular complications and a shorter life expectancy. Approximately 537 million adults are estimated to be living with diabetes worldwide, and this number is expected to reach 783 million by 2045.
Physical activity is an essential component of T2D prevention programs and can favorably affect blood sugar control. A meta-analysis of cohort studies showed that being physically active was associated with a 35% reduction in the risk of acquiring T2D in the general population, and regular walking was associated with a 15% reduction in the risk of developing T2D.
However, no studies have investigated the link between different walking speeds and the risk for T2D. A team from the Research Center at the Semnan University of Medical Sciences in Iran carried out a systematic review of the association between walking speed and the risk of developing T2D in adults; this review was published in the British Journal of Sports Medicine.
10 Cohort Studies
This systematic review used publications (1999-2022) available in the usual data sources (PubMed, Scopus, CENTRAL, and Web of Science). Random-effects meta-analyses were used to calculate relative risk (RR) and risk difference (RD) based on different walking speeds. The researchers rated the credibility of subgroup differences and the certainty of evidence using the Instrument to assess the Credibility of Effect Modification ANalyses (ICEMAN) and Grading of Recommendations Assessment, Development, and Evaluation (GRADE) tools, respectively.
Of the 508,121 potential participants, 18,410 adults from 10 prospective cohort studies conducted in the United States, Japan, and the United Kingdom were deemed eligible. The proportion of women was between 52% and 73%, depending on the cohort. Follow-up duration varied from 3 to 11.1 years (median, 8 years).
Five cohort studies measured walking speed using stopwatch testing, while the other five used self-assessed questionnaires. To define cases of T2D, seven studies used objective methods such as blood glucose measurement or linkage with medical records, and in three cohorts, self-assessment questionnaires were used (these were checked against patient records). All studies controlled for age, sex, and tobacco consumption in the multivariate analyses, and some also controlled for alcohol consumption, blood pressure, total physical activity volume, body mass index, time spent walking or daily step count, and a family history of diabetes.
The Right Speed
The authors first categorized walking speed into four prespecified levels: easy or casual (< 2 mph or < 3.2 km/h), average or normal (2-3 mph or 3.2-4.8 km/h), fairly brisk (3-4 mph or 4.8-6.4 km/h), and very brisk or brisk/striding (> 4 mph or > 6.4 km/h).
Four cohort studies with 6,520 cases of T2D among 160,321 participants reported information on average or normal walking. Participants with average or normal walking were at a 15% lower risk for T2D than those with easy or casual walking (RR = 0.85 [95% CI, 0.70-1.00]; RD = 0.86 [1.72-0]). Ten cohort studies with 18,410 cases among 508,121 participants reported information on fairly brisk walking. Those with fairly brisk walking were at a 24% lower risk for T2D than those with easy or casual walking (RR = 0.76 [0.65-0.87]; I² = 90%; RD = 1.38 [2.01-0.75]).
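The relative and absolute measures reported above are linked by simple arithmetic: the risk difference equals the baseline risk times (1 − RR). A minimal sketch, assuming an illustrative baseline risk chosen so the numbers line up with the pooled RD of 1.38 fewer cases per 100 (the study's actual RD comes from its own meta-analytic model):

```python
# How relative risk (RR) relates to risk difference (RD).
# baseline_risk is an assumption for illustration, not a figure from the study.
baseline_risk = 0.0575   # assumed T2D risk with easy/casual walking (5.75 per 100)
rr_fairly_brisk = 0.76   # pooled RR for fairly brisk vs easy/casual walking

risk_brisk = baseline_risk * rr_fairly_brisk          # absolute risk if brisk
rd_per_100 = (baseline_risk - risk_brisk) * 100       # fewer cases per 100

print(f"RD: {rd_per_100:.2f} fewer cases per 100 participants")
```

Under this assumed baseline, the calculation reproduces the reported RD of 1.38 fewer cases per 100 participants; a different baseline risk would yield a different absolute benefit for the same RR.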
There was no significant or credible subgroup difference by adjustment for the total physical activity or time spent walking per day. The dose-response analysis suggested that the risk for T2D decreased significantly at a walking speed of 4 km/h and above.
Study Limitations
This meta-analysis has strengths that may increase the generalizability of its results. The researchers included cohort studies, which allowed them to consider the temporal sequence of exposure and outcome. Cohort studies are less affected by recall and selection biases than retrospective case–control studies, which increases the likelihood that the observed associations are causal. The researchers also assessed the credibility of subgroup differences using the recently developed ICEMAN tool, calculated both relative and absolute risks, and rated the certainty of evidence using the GRADE approach.
Some shortcomings must be considered. Most of the studies included in the present review were rated as having a serious risk for bias, with the most important biases resulting from inadequate adjustment for potential confounders and the methods used for walking speed assessment and diagnosis of T2D. In addition, the findings could have been subject to reverse causality bias because participants with faster walking speed are more likely to perform more physical activity and have better cardiorespiratory fitness, greater muscle mass, and better health status. However, the subgroup analyses of fairly brisk and brisk/striding walking indicated that there were no significant subgroup differences by follow-up duration and that the significant inverse associations remained stable in the subgroup of cohort studies with a follow-up duration of > 10 years.
The authors concluded, “While current strategies to increase total walking time are beneficial, it may also be reasonable to encourage people to walk at faster speeds to further increase the health benefits of walking.”
This article was translated from JIM, which is part of the Medscape Professional Network. A version of this article appeared on Medscape.com.
FROM THE BRITISH JOURNAL OF SPORTS MEDICINE
Hearing Aids and Dementia Risk Study Retracted
The study was published April 13 in The Lancet Public Health and reported at that time. It was retracted by the journal on December 12.
According to the retraction notice, the journal editors in late November were informed by the authors of the paper that an error was introduced in the output format setting of their SAS codes, which led to data for people with hearing loss using hearing aids and those with hearing loss without using hearing aids being switched.
This led to errors in their analysis, “which render their findings and conclusions false and misleading,” the retraction notice states.
These errors were identified by the researchers following an exchange with scientists seeking to reproduce the authors’ findings.
In a statement, The Lancet Group said it “takes issues relating to research integrity extremely seriously” and follows best-practice guidance from the Committee on Publication Ethics (COPE) and the International Committee of Medical Journal Editors (ICMJE).
“Retractions are a rare but important part of the publishing process, and we are grateful to the scientists who prompted the re-examination of the data,” the statement reads.
Despite the retraction, other studies have suggested a link between hearing and dementia.
One study of US Medicare beneficiaries found a 61% higher dementia prevalence in those with moderate to severe hearing loss compared to those with normal hearing.
In this research, even mild hearing loss was associated with increased dementia risk, although it was not statistically significant, and use of hearing aids was tied to a 32% decrease in dementia prevalence.
In addition, a large meta-analysis showed that hearing aids significantly reduce the risk for cognitive decline and dementia and even improve short-term cognitive function in individuals with hearing loss.
A version of this article appeared on Medscape.com.
FROM THE LANCET PUBLIC HEALTH
GLP-1 RAs Associated With Reduced Colorectal Cancer Risk in Patients With Type 2 Diabetes
In particular, GLP-1 RAs were associated with decreased risk compared with other antidiabetic treatments, including insulin, metformin, sodium-glucose cotransporter 2 (SGLT2) inhibitors, sulfonylureas, and thiazolidinediones.
More profound effects were seen in patients with overweight or obesity, “suggesting a potential protective effect against CRC partially mediated by weight loss and other mechanisms related to weight loss,” Lindsey Wang, an undergraduate student at Case Western Reserve University, Cleveland, Ohio, and colleagues wrote in JAMA Oncology.
Testing Treatments
GLP-1 RAs, usually given by injection, are approved by the US Food and Drug Administration to treat type 2 diabetes. They can lower blood sugar levels, improve insulin sensitivity, and help patients manage their weight.
Diabetes, overweight, and obesity are known risk factors for CRC and make prognosis worse. Ms. Wang and colleagues hypothesized that GLP-1 RAs might reduce CRC risk compared with other antidiabetics, including metformin and insulin, which have also been shown to reduce CRC risk.
Using a national database of more than 101 million electronic health records, Ms. Wang and colleagues conducted a population-based study of more than 1.2 million patients who had medical encounters for type 2 diabetes and were subsequently prescribed antidiabetic medications between 2005 and 2019. The patients had no prior antidiabetic medication use nor CRC diagnosis.
The researchers analyzed the effects of GLP-1 RAs on CRC incidence compared with the other prescribed antidiabetic drugs, matching for demographics, adverse socioeconomic determinants of health, preexisting medical conditions, family and personal history of cancers and colonic polyps, lifestyle factors, and procedures such as colonoscopy.
During a 15-year follow-up, GLP-1 RAs were associated with decreased risk for CRC compared with insulin (hazard ratio [HR], 0.56), metformin (HR, 0.75), SGLT2 inhibitors (HR, 0.77), sulfonylureas (HR, 0.82), and thiazolidinediones (HR, 0.82) in the overall study population.
For instance, among 22,572 patients who took insulin, 167 cases of CRC occurred, compared with 94 cases among the matched GLP-1 RA cohort. Among 18,518 patients who took metformin, 153 cases of CRC occurred compared with 96 cases among the matched GLP-1 RA cohort.
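Because each comparison cohort and its matched GLP-1 RA cohort are the same size, a crude risk ratio can be read straight off the case counts quoted above; this is a rough sanity check only, since the reported hazard ratios additionally account for time to event and follow-up:

```python
# Crude risk ratios from the matched-cohort case counts.
# Equal cohort sizes cancel, so the ratio reduces to a ratio of case counts.
def crude_risk_ratio(cases_glp1: int, cases_comparator: int, n: int) -> float:
    return (cases_glp1 / n) / (cases_comparator / n)

# Insulin comparison: 94 vs 167 cases among 22,572 matched patients each
print(round(crude_risk_ratio(94, 167, 22_572), 2))   # ~0.56 (reported HR, 0.56)

# Metformin comparison: 96 vs 153 cases among 18,518 matched patients each
print(round(crude_risk_ratio(96, 153, 18_518), 2))   # ~0.63 (reported HR, 0.75)
```

The crude ratios track the direction of the reported hazard ratios but need not match them exactly, as the metformin comparison shows.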
GLP-1 RAs were also associated with a lower, though not statistically significant, risk compared with alpha-glucosidase inhibitors (HR, 0.59) and dipeptidyl peptidase-4 (DPP-4) inhibitors (HR, 0.93).
In patients with overweight or obesity, GLP-1 RAs were associated with a lower risk for CRC than most of the other antidiabetics, including insulin (HR, 0.5), metformin (HR, 0.58), SGLT2 inhibitors (HR, 0.68), sulfonylureas (HR, 0.63), thiazolidinediones (HR, 0.73), and DPP-4 inhibitors (HR, 0.77).
Consistent findings were observed in women and men.
“Our results clearly demonstrate that GLP-1 RAs are significantly more effective than popular antidiabetic drugs, such as metformin or insulin, at preventing the development of CRC,” said Nathan Berger, MD, co-lead researcher, professor of experimental medicine, and member of the Case Comprehensive Cancer Center.
Targets for Future Research
Study limitations include potential unmeasured or uncontrolled confounders, self-selection, reverse causality, and other biases involved in observational studies, the research team noted.
Further research is warranted to investigate the effects in patients with prior antidiabetic treatments, underlying mechanisms, potential variation in effects among different GLP-1 RAs, and the potential of GLP-1 RAs to reduce the risks for other obesity-associated cancers, the researchers wrote.
“To our knowledge, this is the first indication this popular weight loss and antidiabetic class of drugs reduces incidence of CRC, relative to other antidiabetic agents,” said Rong Xu, PhD, co-lead researcher, professor of medicine, and member of the Case Comprehensive Cancer Center.
The study was supported by the National Cancer Institute Case Comprehensive Cancer Center, American Cancer Society, Landon Foundation-American Association for Cancer Research, National Institutes of Health Director’s New Innovator Award Program, National Institute on Aging, and National Institute on Alcohol Abuse and Alcoholism. Several authors reported grants from the National Institutes of Health during the conduct of the study.
A version of this article appeared on Medscape.com.
analysis.
In particular, GLP-1 RAs were associated with decreased risk compared with other antidiabetic treatments, including insulin, metformin, sodium-glucose cotransporter 2 (SGLT2) inhibitors, sulfonylureas, and thiazolidinediones.
More profound effects were seen in patients with overweight or obesity, “suggesting a potential protective effect against CRC partially mediated by weight loss and other mechanisms related to weight loss,” Lindsey Wang, an undergraduate student at Case Western Reserve University, Cleveland, Ohio, and colleagues wrote in JAMA Oncology.
Testing Treatments
GLP-1 RAs, usually given by injection, are approved by the US Food and Drug Administration to treat type 2 diabetes. They can lower blood sugar levels, improve insulin sensitivity, and help patients manage their weight.
Diabetes, overweight, and obesity are known risk factors for CRC and make prognosis worse. Ms. Wang and colleagues hypothesized that GLP-1 RAs might reduce CRC risk compared with other antidiabetics, including metformin and insulin, which have also been shown to reduce CRC risk.
Using a national database of more than 101 million electronic health records, Ms. Wang and colleagues conducted a population-based study of more than 1.2 million patients who had medical encounters for type 2 diabetes and were subsequently prescribed antidiabetic medications between 2005 and 2019. The patients had no prior antidiabetic medication use nor CRC diagnosis.
The researchers analyzed the effects of GLP-1 RAs on CRC incidence compared with the other prescribed antidiabetic drugs, matching for demographics, adverse socioeconomic determinants of health, preexisting medical conditions, family and personal history of cancers and colonic polyps, lifestyle factors, and procedures such as colonoscopy.
During a 15-year follow-up, GLP-1 RAs were associated with decreased risk for CRC compared with insulin (hazard ratio [HR], 0.56), metformin (HR, 0.75), SGLT2 inhibitors (HR, 0.77), sulfonylureas (HR, 0.82), and thiazolidinediones (HR, 0.82) in the overall study population.
For instance, among 22,572 patients who took insulin, 167 cases of CRC occurred, compared with 94 cases among the matched GLP-1 RA cohort. Among 18,518 patients who took metformin, 153 cases of CRC occurred compared with 96 cases among the matched GLP-1 RA cohort.
GLP-1 RAs were also associated with a lower, though not statistically significant, risk for CRC compared with alpha-glucosidase inhibitors (HR, 0.59) and dipeptidyl peptidase-4 (DPP-4) inhibitors (HR, 0.93).
In patients with overweight or obesity, GLP-1 RAs were associated with a lower risk for CRC than most of the other antidiabetics, including insulin (HR, 0.5), metformin (HR, 0.58), SGLT2 inhibitors (HR, 0.68), sulfonylureas (HR, 0.63), thiazolidinediones (HR, 0.73), and DPP-4 inhibitors (HR, 0.77).
Consistent findings were observed in women and men.
“Our results clearly demonstrate that GLP-1 RAs are significantly more effective than popular antidiabetic drugs, such as metformin or insulin, at preventing the development of CRC,” said Nathan Berger, MD, co-lead researcher, professor of experimental medicine, and member of the Case Comprehensive Cancer Center.
Targets for Future Research
Study limitations include potential unmeasured or uncontrolled confounders, self-selection, reverse causality, and other biases involved in observational studies, the research team noted.
Further research is warranted to investigate the effects in patients with prior antidiabetic treatments, underlying mechanisms, potential variation in effects among different GLP-1 RAs, and the potential of GLP-1 RAs to reduce the risks for other obesity-associated cancers, the researchers wrote.
“To our knowledge, this is the first indication that this popular weight loss and antidiabetic class of drugs reduces incidence of CRC, relative to other antidiabetic agents,” said Rong Xu, PhD, co-lead researcher, professor of medicine, and member of the Case Comprehensive Cancer Center.
The study was supported by the National Cancer Institute Case Comprehensive Cancer Center, American Cancer Society, Landon Foundation-American Association for Cancer Research, National Institutes of Health Director’s New Innovator Award Program, National Institute on Aging, and National Institute on Alcohol Abuse and Alcoholism. Several authors reported grants from the National Institutes of Health during the conduct of the study.
A version of this article appeared on Medscape.com.