‘Striking’ jump in cost of brand-name epilepsy meds
The cost of brand-name antiseizure medications (ASMs) has risen sharply over the past decade, a new analysis shows.
After adjustment for inflation, the cost of a 1-year supply of brand-name ASMs grew 277%, while generics became 42% less expensive.
“Our study makes transparent striking trends in brand name prescribing patterns,” the study team wrote.
Since 2010, the costs for brand-name ASMs have “consistently” increased. Costs were particularly boosted by increases in prescriptions for lacosamide (Vimpat), in addition to a “steep increase in the cost per pill, with brand-name drugs costing 10 times more than their generic counterparts,” first author Samuel Waller Terman, MD, of the University of Michigan, Ann Arbor, added in a news release.
The study was published online in Neurology.
Is a 10-fold increase in cost worth it?
To evaluate trends in ASM prescriptions and costs, the researchers used a random sample of 20% of Medicare beneficiaries with coverage from 2008 to 2018. There were 77,000 to 133,000 patients with epilepsy each year.
Over time, likely because of the increasing availability of generics, brand-name ASMs accounted for a shrinking proportion of pills prescribed – falling from 56% in 2008 to 14% in 2018 – yet they still accounted for 79% of prescription drug costs in 2018.
The annual cost of brand-name ASMs rose from $2,800 in 2008 to $10,700 in 2018, while the cost of generic drugs decreased from $800 to $460 during that time.
An increased number of prescriptions for lacosamide was responsible for 45% of the total increase in brand-name costs.
As of 2018, lacosamide comprised 30% of all brand-name pill supply (followed by pregabalin, at 15%) and 30% of all brand-name costs (followed by clobazam and pregabalin, both at 9%), the investigators reported.
Brand-name antiepileptic drug costs decreased from 2008 to 2010, but after the introduction of lacosamide, total brand-name costs steadily rose from $72 million in 2010 (in 2018 dollars) to $256 million in 2018, they noted.
Because the dataset consists of a 20% random Medicare sample, total Medicare costs for brand-name ASMs for beneficiaries with epilepsy alone likely rose from roughly $360 million in 2010 to $1.3 billion in 2018, they added.
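As a rough back-of-the-envelope check of that extrapolation (the dollar figures are the study’s; the arithmetic here is ours): because a 20% sample represents one-fifth of beneficiaries, multiplying the sampled totals by 5 reproduces the program-wide estimates – $72 million × 5 = $360 million for 2010, and $256 million × 5 = $1.28 billion, or roughly $1.3 billion, for 2018.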
“Clinicians must remain cognizant of this societal cost magnitude when judging whether the 10-fold increased expense per pill for brand name medications is worth the possible benefits,” they wrote.
“While newer-generation drugs have potential advantages such as limited drug interactions and different side effect profiles, there have been conflicting studies on whether they are cost effective,” Dr. Terman noted in a news release.
A barrier to treatment
The authors of an accompanying editorial propose that the problem of prescription drug costs could be solved through a combination of competition and government regulation of prices. Patients and physicians are the most important stakeholders in this issue.
“When something represents 14% of the total use, but contributes 79% of the cost, it would be wise to consider alternatives, assuming that these alternatives are not of lower quality,” wrote Wyatt Bensken, with Case Western Reserve University, Cleveland, and Iván Sánchez Fernández, MD, with Boston Medical Center.
“When there are several ASMs with a similar mechanism of action, similar efficacy, similar safety and tolerability profile, and different costs, it would be unwise to choose the more expensive alternative just because it is newer,” they said.
This study, they added, provides data to “understand, and begin to act, on the challenging problem of the cost of prescription ASMs. After all, what is the point of having a large number of ASMs if their cost severely limits their use?”
A limitation of the study is that only Medicare prescription claims were included, so the results may not apply to younger patients with private insurance.
The study received no direct funding. The authors and editorialists have disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Scientists find brain mechanism behind age-related memory loss
Scientists at Johns Hopkins University have identified a mechanism in the brain behind age-related memory loss, expanding our knowledge of the inner workings of the aging brain and possibly opening the door to new Alzheimer’s treatments.
The researchers looked at the hippocampus, a part of the brain thought to store long-term memories.
Neurons there are responsible for a pair of memory functions – called pattern separation and pattern completion – that work together in young, healthy brains. These functions can swing out of balance with age, impacting memory.
The Johns Hopkins team may have discovered what causes this imbalance. Their findings – reported in a paper in the journal Current Biology – may not only help us improve dementia treatments, but even prevent or delay a loss of thinking skills in the first place, the researchers say.
Pattern separation vs. pattern completion
To understand how the hippocampus changes with age, the researchers looked at rats’ brains. In both rats and humans, pattern separation and pattern completion are controlled by neurons in the hippocampus.
As the name suggests, pattern completion is when you take a few details or fragments of information – a few notes of music, or the start of a famous movie quote – and your brain retrieves the full memory. Pattern separation, on the other hand, is being able to tell similar observations or experiences apart (like two visits to the same restaurant) to be stored as separate memories.
These functions occur along a gradient across a tiny region called CA3. That gradient, the study found, disappears with aging, said lead study author Hey-Kyoung Lee, PhD, an assistant research scientist at the university’s Zanvyl Krieger Mind/Brain Institute. “The main consequence of the loss,” Dr. Lee said, “is that pattern completion becomes more dominant in rats as they age.”
What’s happening in the brain
Neurons responsible for pattern completion occupy the “distal” end of CA3, while those in charge of pattern separation reside at the “proximal” end. Dr. Lee said prior studies had not examined the proximal and distal regions separately, as she and her team did in this study.
What was surprising, said Dr. Lee, “was that hyperactivity in aging was observed toward the proximal CA3 region, not the expected distal region.” Contrary to their expectations, that hyperactivity did not enhance function in that area but rather dampened it. Hence: “There is diminished pattern separation and augmented pattern completion,” she said.
This shift may help explain why similar memories blur together for older adults – they may recall a certain restaurant they’d been to but not be able to separate what happened during one visit versus another.
Why do some older adults stay sharp?
That memory impairment does not happen to everyone, and it doesn’t happen to all rats either. In fact, the researchers found that some older rats performed spatial-learning tasks as well as young rats did – even though their brains were already beginning to favor pattern completion.
If we can better understand why this happens, we may uncover new therapies for age-related memory loss, Dr. Lee said.
Coauthor Michela Gallagher’s team previously demonstrated that the anti-epilepsy drug levetiracetam improves memory performance by reducing hyperactivity in the hippocampus.
The extra detail this study adds may allow scientists to better aim such drugs in the future, Dr. Lee speculated. “It would give us better control of where we could possibly target the deficits we see.”
A version of this article first appeared on WebMD.com.
FROM CURRENT BIOLOGY
Can bone density scans help predict dementia risk?
Calcification in the abdominal aorta seen on routine bone density scans may help identify older adults at risk for late-life dementia, new research suggests.
In an analysis of more than 900 study participants, women in their 70s with more advanced abdominal aortic calcification (AAC) seen on lateral spine images during dual-energy x-ray absorptiometry (DXA) had a two- to fourfold higher risk for late-life dementia than those with low AAC.
This finding was independent of cardiovascular risk factors and apolipoprotein E (APOE) genotype.
“While these results are exciting, we now need to undertake further large screening studies in older men and women using this approach to show that the findings are generalizable to older men and can identify people with greater cognitive decline,” coinvestigator Marc Sim, PhD, Edith Cowan University, Joondalup, Australia, said in an interview.
“This will hopefully open the door to studies of early disease-modifying interventions,” Dr. Sim said.
The findings were published online in The Lancet Regional Health – Western Pacific.
AAC and cognition
Late-life dementia occurring after age 80 is increasingly common because of both vascular and nonvascular risk factors.
Two recent studies in middle-aged and older men and women showed that AAC identified on bone densitometry was associated with poorer cognition, suggesting it may be related to cognitive decline and increased dementia risk.
This provided the rationale for the current study, Dr. Sim noted.
The researchers assessed AAC using DXA lateral spine images captured in the late 1990s in a prospective cohort of 958 older women who were participating in an osteoporosis study.
AAC was classified into established low, moderate, and extensive categories. At baseline, all women were aged 70 and older, and 45% had low AAC, 36% had moderate AAC, and 19% had extensive AAC.
Over 14.5 years of follow-up, 150 women (15.7%) experienced late-life dementia hospitalization and/or death.
Improved risk prediction
Results showed that, compared with women who had low AAC, women with moderate and extensive AAC were more likely to experience late-life dementia hospitalization (9.3% low, 15.5% moderate, and 18.3% extensive) and death (2.8%, 8.3%, and 9.4%, respectively).
After multivariable adjustment, women with moderate AAC had a twofold increased relative risk for late-life dementia hospitalization and a threefold increased risk for late-life dementia death, compared with their peers who had low AAC.
Women with extensive AAC had twofold and fourfold increases, respectively, in the adjusted relative risks for late-life dementia hospitalization and death.
“To our knowledge this is the first time it has been shown that AAC from these scans is related to late-life dementia,” Dr. Sim said.
“We demonstrated that AAC improved risk prediction in addition to cardiovascular risk factors and APOE genotype, a genetic risk factor for Alzheimer’s disease, the major form of dementia,” he added.
Dr. Sim noted “these additional lateral spine images” can be taken at the same time that hip and spine bone density tests are done.
“This provides an opportunity to identify AAC in large numbers of people,” he said.
He cautioned, however, that further studies with detailed dementia-related phenotypes, brain imaging, and measures of cognition are needed to confirm whether AAC will add value to dementia risk prediction.
‘Not surprising’
Commenting on the findings for this article, Claire Sexton, DPhil, senior director of scientific programs and outreach at the Alzheimer’s Association, Chicago, noted that AAC is a marker of atherosclerosis and is associated with vascular health outcomes.
Therefore, it is “not surprising it would be associated with dementia too. There’s been previous research linking atherosclerosis and Alzheimer’s disease,” Dr. Sexton said.
“What’s novel about this research is that it’s looking at AAC specifically, which can be identified through a relatively simple test that is already in widespread use,” she added.
Dr. Sexton noted that “much more research” is now needed in larger, more diverse populations in order to better understand the link between AAC and dementia – and whether bone density testing may be an appropriate dementia-screening tool.
“The good news is vascular conditions like atherosclerosis can be managed through lifestyle changes like eating a healthy diet and getting regular exercise. And research tells us what’s good for the heart is good for the brain,” Dr. Sexton said.
The study was funded by Kidney Health Australia, Healthway Health Promotion Foundation of Western Australia, Sir Charles Gairdner Hospital Research Advisory Committee Grant, and the National Health and Medical Research Council of Australia. Dr. Sim and Dr. Sexton have reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM THE LANCET REGIONAL HEALTH – WESTERN PACIFIC
First-ever Huntington staging system may jump-start drug development for early-stage disease
Researchers liken the Huntington’s Disease Integrated Staging System (HD-ISS) to the system currently used to stage cancer. It groups patients according to their underlying biological, clinical, and functional characteristics.
It also includes criteria to biologically define Huntington’s disease stages across the entire disease spectrum, from birth to death, which is something that has not been done before. For now, the HD-ISS is only intended for research, but it could one day be modified for use in the clinic, investigators wrote.
“This systematization is of critical importance to select the most appropriate target population for clinical trials and studies,” said co-investigator Cristina Sampaio, MD, chief medical officer at the CHDI Foundation, Princeton, N.J.
“By providing a methodology to precisely define cases early in the neurodegenerative process, the HD-ISS will be instrumental in conducting trials in the very early disease stages,” Dr. Sampaio added.
The position paper was published in the July issue of the Lancet Neurology.
New approach needed
There is no approved therapy to slow Huntington’s disease progression. Clinical trials currently enroll patients with demonstrable symptoms, which limits the ability to test therapeutics that could delay or prevent neurodegeneration.
Huntington’s disease is rare, occurring in about 2.7 per 100,000 individuals worldwide. It is caused by a mutation in the HTT gene involving a DNA segment known as a CAG trinucleotide repeat.
Currently, Huntington’s disease is diagnosed on the basis of clinical signs that emerge late in the disease course, an approach developed before the discovery of the HTT gene and the development of the genetic test for the CAG mutation.
The disease phase prior to diagnosis has been described as presymptomatic, premanifest, or prodromal. However, the three terms have varying definitions that make it difficult to compare study results across trials.
Because drug development had focused on the overt motor sign phase of the disease, there was no real need for an evidence-based staging system that classified disease phases from birth, the investigators noted.
“Now, the research community and regulators recognize that it is critical to conduct trials early in the disease when no signs or overt symptoms are measurable,” Dr. Sampaio said.
Defining disease stages
Work on the staging system was done through the Huntington’s Disease Regulatory Science Consortium, an international project begun in 2018 among biotech and pharma companies, academic institutions, and nonprofit research and advocacy organizations.
Overall, more than 50 clinicians and researchers were involved in developing the HD-ISS.
Using modeling data from four large observational studies that included patients with Huntington’s disease and control groups, researchers identified four different stages of Huntington’s disease:
- Stage 0: Begins at birth with identification of HTT gene mutations but no detectable pathologic changes.
- Stage 1: Begins when biomarker changes are detected via MRI by a volume decrease in six brain areas.
- Stage 2: Begins when clinical signs of Huntington’s disease are present, as determined through motor and cognitive assessments.
- Stage 3: Begins when functional decline is evident, with worsening on the Independence Scale and the Total Functional Capacity of the Unified Huntington’s Disease Rating Scale.
Applying the HD-ISS to clinical trials requires the collection of information routinely recorded in Huntington’s disease research, as well as some additional data, but researchers say its application is straightforward.
The HD-ISS uses a numerical staging system similar to that used in the U.S. Food and Drug Administration’s guidance for Alzheimer’s disease (AD) and integrates the prodromal, presymptomatic, or premanifest phase of the disease. This distinguishes it from earlier classification systems.
The HD-ISS can be adapted if new Huntington’s disease biomarkers are identified.
“As research results are generated, this will further validate the HD-ISS and potentially lead to the development of a derivative, and possibly simplified, system for clinical practice,” Dr. Sampaio said.
The new system goes further than a recent proposal from the Movement Disorder Society task force, which addresses earlier stages of Huntington’s disease but doesn’t consider objective biomarker data.
Question of timing
Commenting on the findings, Erin Furr-Stimming, MD, neurologist and director of the Huntington’s Disease Society of America Center of Excellence with McGovern Medical School, UTHealth, Houston, said targeting early-stage disease will be key.
“Similar to more common neurodegenerative diseases such as Alzheimer’s disease and Parkinson’s disease, there is a period of at least a decade when changes are occurring in the nervous system, prior to the manifestation of clinical symptoms and signs significant enough to warrant a clinical diagnosis,” Dr. Furr-Stimming said.
She noted that multiple trials of disease-modifying agents for Alzheimer’s disease, Parkinson’s disease, and Huntington’s disease have failed for a multitude of reasons, “but one consistent question that is relevant to all these diseases is that of timing: Should we intervene and test these therapies earlier?
“The premanifest or prodromal period may be the ideal time to intervene with a disease-modifying therapy, prior to onset of any neurodegeneration,” Dr. Furr-Stimming said.
The CHDI Foundation provided financial support to the Critical Path Institute for the Huntington’s Disease Regulatory Science Consortium, including all working group efforts. Dr. Sampaio is an employee of and receives salary from CHDI Management. She has also received consultancy honorariums (unrelated to Huntington’s disease) from Pfizer, Kyowa Kirin, vTv Therapeutics, GW Pharmaceuticals, Neuraly, Neuroderm, Green Valley Pharmaceuticals, and Pinteon Pharmaceuticals. A full list of disclosures for the other researchers is in the original article. Dr. Furr-Stimming reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM THE LANCET NEUROLOGY
Acupuncture deep needling technique points to greater tension headache relief
Acupuncture with deeper needle insertion may provide greater relief from chronic tension-type headache (TTH), new research suggests. Results of a randomized trial showed that although the majority of participants reported some relief from TTH after 8 weeks of acupuncture treatment, those who received needling at a depth of 12.5-20.0 mm reported the greatest reduction in headache frequency and severity.
At this depth, acupuncture promotes deqi sensation, a feeling of numbness, soreness, heaviness, or irritating pain at the needling site that is considered key to successful treatment in traditional Chinese acupuncture theory.
“Our study showed that deqi sensation could enhance the effect of acupuncture in the treatment of chronic TTH, and the effect of acupuncture lasted at least 6 months when the treatment was stopped,” said co-investigator Ying Li, MD, PhD, The Third Hospital/Acupuncture and Tuina School, Chengdu University of Traditional Chinese Medicine, China.
The findings were published online in Neurology.
Deqi sensation key
TTH is the most common type of headache, with a lifetime prevalence of up to 78% in some studies. The pain is often described as throbbing or a vice-like tightness on both sides of the head. TTH is considered chronic when it occurs at least 15 days a month.
Previous studies have suggested that acupuncture can offer relief from headache pain, but specific information on TTH, especially chronic TTH, has been lacking.
To address the issue, researchers designed a parallel-design, patient- and assessor-blinded randomized controlled trial that included 218 individuals with a history of chronic TTH. None had received prophylactic treatment in the previous 3 months.
The treatment group (n = 110) received 20 sessions of true acupuncture (TA) over 8 weeks. This included three sessions per week in the first 4 weeks and two sessions per week in the last 4 weeks. The depth of needling at each point ranged from 12.5 to 20 mm, which is needed to achieve deqi sensation.
The control group (n = 108) received superficial acupuncture (SA) on the same schedule as the TA group and at traditional acupuncture points. However, this was done at a maximum depth of 2 mm, which is not deep enough for deqi sensation.
At week 16, 68.2% of the participants receiving TA reported a greater than 50% reduction in monthly headache days, compared with 48.1% of those receiving SA (odds ratio, 2.65; P < .001).
Mean monthly headache days decreased from 20.38 days at baseline to 7.48 days at week 32 in the TA group versus 22.6 days at baseline to 11.94 days in the SA group.
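For readers who want to sanity-check these figures, below is a minimal sketch in Python that assumes only the percentages and means reported above. A crude odds ratio computed directly from the raw responder rates comes out near 2.3; the published 2.65 presumably reflects the trial’s adjusted statistical model.

```python
# Back-of-the-envelope check of the reported trial numbers (illustrative
# only; assumes nothing beyond the figures quoted in the article).

def odds(p: float) -> float:
    """Convert a responder proportion to odds."""
    return p / (1 - p)

# Responder rates at week 16 (>50% reduction in monthly headache days)
crude_or = odds(0.682) / odds(0.481)
print(f"crude odds ratio ≈ {crude_or:.2f}")  # ≈ 2.31 (published, adjusted: 2.65)

# Mean monthly headache days, baseline -> week 32
for group, base, final in [("TA", 20.38, 7.48), ("SA", 22.6, 11.94)]:
    drop = base - final
    print(f"{group}: -{drop:.2f} days ({drop / base:.0%} relative reduction)")
# TA: -12.90 days (63% relative reduction)
# SA: -10.66 days (47% relative reduction)
```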
Headache intensity and severity decreased in both groups, although those who achieved deqi sensation reported the most improvement.
Only four patients reported adverse effects, all of which were mild, and none required treatment.
Patients in both groups reported some pain relief, suggesting that those who are not comfortable with deqi sensation may still benefit from superficial acupuncture, although to a lesser extent, Dr. Li said.
“We assume that the point-specific effect and placebo effect were combined to give the patients relief of headaches,” Dr. Li added. “Further, the effect of deqi sensation added more treatment effect. This might be explained by gate-control theory or other unknown mechanisms.”
Deeper understanding?
Commenting on the research, Jennifer Bickel, MD, a senior member of neurology at Moffitt Cancer Center and professor of oncologic sciences at the University of South Florida, Tampa, said the study provides a deeper understanding of acupuncture’s efficacy for chronic TTH, which could aid clinicians who are unfamiliar with the therapy or with when and how to refer patients for it.
“This study provides a more descriptive outline for what type of acupuncture treatment and duration can be effective for patients so doctors can prep patients on what to expect and so doctors can better assess if patients received appropriate acupuncture for their headaches,” said Dr. Bickel, who was not involved with the research.
However, she noted that the acupuncture sites and techniques did not vary during the trial. Although that makes sense for a controlled study, it may not reflect real-world clinical practice, she added.
“The downside is that the study didn’t fully reflect that most acupuncturists in clinical practice would alter treatments during the 20 sessions based on the patient’s response and accompanying symptoms or comorbidities,” Dr. Bickel said.
The study also lacked information on medication overuse headache or patients’ prior history of TTH treatments.
“This could be helpful to understand which patients in clinical practice are most likely to benefit from treatment,” Dr. Bickel said.
Study authors received funding from the Department of Science and Technology of Sichuan Province and the National Natural Science Foundation of China. Dr. Li, Dr. Bickel, and Dr. Vickers report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM NEUROLOGY
What’s the best time of day to exercise? It depends on your goals
For most of us, the “best” time of day to work out is simple: When we can.
Maybe that’s before or after work. Or when the gym offers free daycare. Or when our favorite instructor teaches our favorite class.
That’s why we call it a “routine.” And if the results are the same, it’s hard to imagine changing it up.
But what if the results aren’t the same?
They may not be, according to a new study from a research team at Skidmore College in Saratoga Springs, N.Y.
Women who worked out in the morning lost more fat, while those who trained in the evening gained more upper-body strength and power. As for men, the performance improvements were similar no matter when they exercised. But those who did so in the evening had a significant drop in blood pressure, among other benefits.
The study is part of a growing body of research showing different results for different times of day among different populations. As it turns out, when you exercise can ultimately have a big effect, not just on strength and fat loss but also on heart health, mood, and quality of sleep.
An accidental discovery
The original goal of the Skidmore study was to test a unique fitness program with a group of healthy, fit, and extremely active adults in early middle age.
The program includes four workouts a week, each with a different focus: strength, steady-pace endurance, high-intensity intervals, and flexibility (traditional stretching combined with yoga and Pilates exercises).
But because the group was so large – 27 women and 20 men completed the 3-month program – the researchers split participants into morning and evening workout groups.
It wasn’t until researchers looked at the results that they saw the differences between morning and evening exercise, says lead author Paul Arciero, PhD.
Dr. Arciero stressed that participants in every group got leaner and stronger. But the women who worked out in the morning got much bigger reductions in body fat and body-fat percentage than the evening group. Meanwhile, women in the evening group got much bigger gains in upper-body strength, power, and muscular endurance than their morning counterparts.
Among the men, the evening group had significantly larger improvements in blood pressure, cholesterol levels, and the percentage of fat they burned for energy, along with a bigger drop in feelings of fatigue.
Strategic timing for powerful results
Some of these findings are consistent with previous research. For example, a study published in 2021 showed that the ability to exert high effort and express strength and power peaks in the late afternoon, about the same time that your core body temperature is at its highest point.
On the other hand, you’ll probably perform better in the morning when the activity requires a lot of skill and coordination or depends on strategic decision-making.
The findings apply to both men and women.
Performance aside, exercise timing might offer strong health benefits for men with type 2 diabetes, or at high risk for it.
A study showed that men who exercised between 3 p.m. and 6 p.m. saw dramatic improvements in blood sugar management and insulin sensitivity, compared to a group that worked out between 8 a.m. and 10 a.m.
They also lost more fat during the 12-week program, even though they were doing the exact same workouts.
Train consistently, sleep well
When you exercise can affect your sleep quality in many ways, said neuroscientist Jennifer Heisz, PhD, of McMaster University, Hamilton, Ont.
First, she said, “exercise helps you fall asleep faster and sleep deeper at night.” (The only exception is if you exercise so intensely or so close to bedtime that your heart rate is still elevated.)
Second, “exercising at a consistent time every day helps regulate the body’s circadian rhythms.” It doesn’t matter if the exercise is in the morning, evening, or anywhere in between. As long as it’s predictable, it will help you fall asleep and wake up at the same times.
Outdoor exercise is even better, she said. The sun is the most powerful regulator of the circadian clock and works in tandem with physical activity.
Third, exercising at specific times can help you overcome jet lag or adjust to an earlier or later shift at work.
“Exercising at 7 a.m. or between 1 and 4 p.m. helps your circadian clock to ‘fall back’ in time, making it easier to wake up earlier,” Dr. Heisz said. If you need to train your body to wake up later in the morning, try working out between 7 p.m. and 10 p.m.
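Purely as an illustration, the toy lookup below restates these timing tips as data. The names and structure are hypothetical, and it only re-encodes what the article says; it is not medical advice.

```python
# Hypothetical restatement of the circadian-shift tips quoted above.
CIRCADIAN_SHIFT_WINDOWS = {
    "wake earlier": ["07:00", "13:00-16:00"],  # 7 a.m., or 1-4 p.m.
    "wake later": ["19:00-22:00"],             # 7-10 p.m.
}

def suggested_workout_windows(goal: str) -> list[str]:
    """Look up the workout windows the article associates with a goal."""
    return CIRCADIAN_SHIFT_WINDOWS.get(goal, [])

print(suggested_workout_windows("wake earlier"))  # ['07:00', '13:00-16:00']
```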
All exercise is good, but the right timing can make it even better
“The best time to exercise is when you can fit it in,” Dr. Arciero said. “You’ve got to choose the time that fits your lifestyle best.”
But context matters, he noted.
“For someone needing to achieve an improvement in their risk for cardiometabolic disease,” his study shows an advantage to working out later in the day, especially for men. If you’re more focused on building upper-body strength and power, you’ll probably get better results from training in the afternoon or evening.
And for fat loss, the Skidmore study shows better results for women who did morning workouts.
And if you’re still not sure? Try sleeping on it – preferably after your workout.
A version of this article first appeared on WebMD.com.
FROM FRONTIERS IN PHYSIOLOGY
How we treat acute pain could be wrong
In a surprising discovery that flies in the face of conventional medicine, researchers report that blocking inflammation with anti-inflammatory drugs to relieve acute pain may make that pain more likely to become chronic.
The paper, published in Science Translational Medicine, suggests that inflammation, a normal part of injury recovery, helps resolve acute pain and prevents it from becoming chronic. Blocking that inflammation may interfere with this process, leading to harder-to-treat pain.
“What we’ve been doing for decades not only appears to be wrong, but appears to be 180 degrees wrong,” says senior study author Jeffrey Mogil, PhD, a professor in the department of psychology at McGill University in Montreal. “You should not be blocking inflammation. You should be letting inflammation happen. That’s what stops chronic pain.”
Inflammation: Nature’s pain reliever
Wanting to know why pain goes away for some but drags on (and on) for others, the researchers looked at pain mechanisms in both humans and mice. They found that a type of white blood cell known as a neutrophil seems to play a key role.
“In analyzing the genes of people suffering from lower back pain, we observed active changes in genes over time in people whose pain went away,” says Luda Diatchenko, PhD, a professor in the faculty of medicine and Canada excellence research chair in human pain genetics at McGill. “Changes in the blood cells and their activity seemed to be the most important factor, especially in cells called neutrophils.”
To test this link, the researchers blocked neutrophils in mice and found the pain lasted 2-10 times longer than normal. Anti-inflammatory drugs, despite providing short-term relief, had the same pain-prolonging effect – though injecting neutrophils into the mice seemed to keep that from happening.
The findings are supported by a separate analysis of 500,000 people in the United Kingdom that showed those taking anti-inflammatory drugs to treat their pain were more likely to have pain 2-10 years later.
“Inflammation occurs for a reason,” says Dr. Mogil, “and it looks like it’s dangerous to interfere with it.”
Rethinking how we treat pain
Neutrophils arrive early during inflammation, at the onset of injury – just when many of us reach for pain medication. This research suggests it might be better not to block inflammation, instead letting the neutrophils “do their thing.” Taking an analgesic that alleviates pain without blocking neutrophils, like acetaminophen, may be better than taking an anti-inflammatory drug or steroid, says Dr. Mogil.
Still, while the findings are compelling, clinical trials are needed to directly compare anti-inflammatory drugs to other painkillers, the researchers said. This research may also lay the groundwork for new drug development for chronic pain patients, Dr. Mogil says.
“Our data strongly suggests that neutrophils act like analgesics themselves, which is potentially useful in terms of analgesic development,” Dr. Mogil says. “And of course, we need new analgesics.”
A version of this article first appeared on WebMD.com.
FROM SCIENCE TRANSLATIONAL MEDICINE
New guideline for in-hospital care of diabetes says use CGMs
Goal-directed glycemic management – which may include new technologies for glucose monitoring – for non–critically ill hospitalized patients who have diabetes or newly recognized hyperglycemia can improve outcomes, according to a new practice guideline from the Endocrine Society.
Even though roughly 35% of hospitalized patients have diabetes or newly discovered hyperglycemia, there is “wide variability in glycemic management in clinical practice,” writing panel chair Mary Korytkowski, MD, from the University of Pittsburgh, said at the annual meeting of the Endocrine Society. “These patients get admitted to every patient service in the hospital, meaning that every clinical service will encounter this group of patients, and their glycemic management can have a major effect on their outcomes. Both short term and long term.”
This guideline provides strategies “to achieve previously recommended glycemic goals while also reducing the risk for hypoglycemia, and this includes inpatient use of insulin pump therapy or continuous glucose monitoring [CGM] devices, among others,” she said.
It also includes “recommendations for preoperative glycemic goals as well as when the use of correctional insulin – well known as sliding scale insulin – may be appropriate” and when it is not.
The document, which replaces a 2012 guideline, was published online in the Journal of Clinical Endocrinology & Metabolism.
A multidisciplinary panel developed the document over the last 3 years to answer 10 clinical practice questions related to management of non–critically ill hospitalized patients with diabetes or newly discovered hyperglycemia.
Use of CGM devices in hospital
The first recommendation is: “In adults with insulin-treated diabetes hospitalized for noncritical illness who are at high risk of hypoglycemia, we suggest the use of real-time [CGM] with confirmatory bedside point-of-care blood glucose monitoring for adjustments in insulin dosing rather than point-of-care blood glucose testing alone in hospital settings where resources and training are available.” (Conditional recommendation; low certainty of evidence.)
“We were actually very careful in terms of looking at the data” for use of CGMs, Dr. Korytkowski said in an interview.
Although CGMs are approved by the Food and Drug Administration in the outpatient setting, and that’s becoming the standard of care there, they are not yet approved for in-hospital use.
However, the FDA granted an emergency allowance for use of CGMs in hospitals during the COVID-19 pandemic.
That was “when everyone was scrambling for what to do,” Dr. Korytkowski noted. “There was a shortage of personal protective equipment and a real interest in trying to limit the amount of exposure of healthcare personnel in some of these really critically ill patients for whom intravenous insulin therapy was used to control their glucose level.”
On March 1, the FDA granted Breakthrough Devices Designation for Dexcom CGM use in the hospital setting.
The new guideline suggests CGM be used to detect trends in glycemic management, with insulin dosing decisions made with point-of-care glucose measurement (the standard of care).
Implementing CGM for glycemic management in hospitals, Dr. Korytkowski said, would require “extensive staff and nursing education to have people with expertise available to provide support to nursing personnel who are both placing these devices, changing these devices, looking at trends, and then knowing when to remove them for certain procedures such as MRI or radiologic procedures.”
“We know that not all hospitals may be readily available to use these devices,” she said. “It is an area of active research. But the use of these devices during the pandemic, in both critical care and non–critical care setting has really provided us with a lot of information that was used to formulate this suggestion in the guideline.”
The document addresses the following areas: CGM, continuous subcutaneous insulin infusion pump therapy, inpatient diabetes education, prespecified preoperative glycemic targets, use of neutral protamine Hagedorn insulin for glucocorticoid or enteral nutrition-associated hyperglycemia, noninsulin therapies, preoperative carbohydrate-containing oral fluids, carbohydrate counting for prandial (mealtime) insulin dosing, and correctional and scheduled (basal or basal bolus) insulin therapies.
Nine key recommendations
Dr. Korytkowski identified nine key recommendations:
- CGM systems can help guide glycemic management with reduced risk for hypoglycemia.
- Patients experiencing glucocorticoid- or enteral nutrition–associated hyperglycemia require scheduled insulin therapy to address anticipated glucose excursions.
- Selected patients using insulin pump therapy prior to a hospital admission can continue to use these devices in the hospital if they have the mental and physical capacity to do so with knowledgeable hospital personnel.
- Diabetes self-management education provided to hospitalized patients can promote improved glycemic control following discharge with reductions in the risk for hospital readmission. “We know that is recommended for patients in the outpatient setting but often they do not get this,” she said. “We were able to observe that this can also impact long-term outcomes.”
- Patients with diabetes scheduled for elective surgery may have improved postoperative outcomes when preoperative hemoglobin A1c is 8% or less and preoperative blood glucose is less than 180 mg/dL. “This recommendation answers the question: ‘Where should glycemic goals be for people who are undergoing surgery?’ ”
- Providing preoperative carbohydrate-containing beverages to patients with known diabetes is not recommended.
- Patients with newly recognized hyperglycemia or well-managed diabetes on noninsulin therapy may be treated with correctional insulin alone as initial therapy at hospital admission.
- Some noninsulin diabetes therapies can be used in combination with correction insulin for patients with type 2 diabetes who have mild hyperglycemia.
- Correctional insulin – “otherwise known as sliding-scale insulin” – can be used as initial therapy for patients with newly recognized hyperglycemia or type 2 diabetes treated with noninsulin therapy prior to hospital admission.
- Scheduled insulin therapy is preferred for patients experiencing persistent blood glucose values greater than 180 mg/dL and is recommended for patients using insulin therapy prior to admission. (The numeric thresholds in this list and in the preoperative recommendation are illustrated in the sketch after this list.)
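As a purely illustrative aid, the sketch below encodes the two numeric thresholds cited above: persistent glucose above 180 mg/dL favoring scheduled insulin, and the preoperative goals of hemoglobin A1c of 8% or less with glucose under 180 mg/dL. The function names and structure are hypothetical; this is not clinical decision software, and the guideline itself carries far more nuance.

```python
# Toy encoding of the numeric thresholds quoted in the article.
# Hypothetical names; illustrative only -- not clinical decision software.

PERSISTENT_GLUCOSE_MG_DL = 180  # persistent values above this favor
                                # scheduled insulin therapy
PREOP_A1C_MAX_PERCENT = 8.0     # preoperative hemoglobin A1c goal
PREOP_GLUCOSE_MAX_MG_DL = 180   # preoperative blood glucose goal

def scheduled_insulin_preferred(glucose_values_mg_dl: list[float]) -> bool:
    """Persistent blood glucose > 180 mg/dL favors scheduled insulin."""
    return all(g > PERSISTENT_GLUCOSE_MG_DL for g in glucose_values_mg_dl)

def meets_preop_goals(a1c_percent: float, glucose_mg_dl: float) -> bool:
    """Preop goals cited above: A1c <= 8% and glucose < 180 mg/dL."""
    return (a1c_percent <= PREOP_A1C_MAX_PERCENT
            and glucose_mg_dl < PREOP_GLUCOSE_MAX_MG_DL)

print(scheduled_insulin_preferred([195.0, 212.0, 188.0]))  # True
print(meets_preop_goals(7.4, 152.0))                       # True
```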
The guideline writers’ hopes
“We hope that this guideline will resolve debates” about appropriate preoperative glycemic management and when sliding-scale insulin can be used and should not be used, said Dr. Korytkowski.
The authors also hope that “it will stimulate research funding for this very important aspect of diabetes care, and that hospitals will recognize the importance of having access to knowledgeable diabetes care and education specialists who can provide staff education regarding inpatient glycemic management, provide oversight for patients using insulin pump therapy or CGM devices, and empower hospital nurses to provide diabetes [self-management] education prior to patient discharge.”
Claire Pegg, the patient representative on the panel, hopes “that this guideline serves as the beginning of a conversation that will allow inpatient caregivers to provide individualized care to patients – some of whom may be self-sufficient with their glycemic management and others who need additional assistance.”
Development of the guideline was funded by the Endocrine Society. Dr. Korytkowski has reported no relevant financial disclosures.
A version of this article first appeared on Medscape.com.
Goal-directed glycemic management – which may include new technologies for glucose monitoring – for non–critically ill hospitalized patients who have diabetes or newly recognized hyperglycemia can improve outcomes, according to a new practice guideline from the Endocrine Society.
Even though roughly 35% of hospitalized patients have diabetes or newly discovered hyperglycemia, there is “wide variability in glycemic management in clinical practice,” writing panel chair Mary Korytkowski, MD, from the University of Pittsburgh, said at the annual meeting of the Endocrine Society. “These patients get admitted to every patient service in the hospital, meaning that every clinical service will encounter this group of patients, and their glycemic management can have a major effect on their outcomes. Both short term and long term.”
This guideline provides strategies “to achieve previously recommended glycemic goals while also reducing the risk for hypoglycemia, and this includes inpatient use of insulin pump therapy or continuous glucose monitoring [CGM] devices, among others,” she said.
It also includes “recommendations for preoperative glycemic goals as well as when the use of correctional insulin – well known as sliding scale insulin – may be appropriate” and when it is not.
The document, which replaces a 2012 guideline, was published online in the Journal of Clinical Endocrinology & Metabolism.
A multidisciplinary panel developed the document over the last 3 years to answer 10 clinical practice questions related to management of non–critically ill hospitalized patients with diabetes or newly discovered hyperglycemia.
Use of CGM devices in hospital
The first recommendation is: “In adults with insulin-treated diabetes hospitalized for noncritical illness who are at high risk of hypoglycemia, we suggest the use of real-time [CGM] with confirmatory bedside point-of-care blood glucose monitoring for adjustments in insulin dosing rather than point-of-care blood glucose rather than testing alone in hospital settings where resources and training are available.” (Conditional recommendation. Low certainty of evidence).
“We were actually very careful in terms of looking at the data” for use of CGMs, Dr. Korytkowski said in an interview.
Although CGMs are approved by the Food and Drug Administration in the outpatient setting, and that’s becoming the standard of care there, they are not yet approved for in-hospital use.
However, the FDA granted an emergency allowance for use of CGMs in hospitals during the COVID-19 pandemic.
That was “when everyone was scrambling for what to do,” Dr. Korytkowski noted. “There was a shortage of personal protective equipment and a real interest in trying to limit the amount of exposure of healthcare personnel in some of these really critically ill patients for whom intravenous insulin therapy was used to control their glucose level.”
On March 1, the FDA granted Breakthrough Devices Designation for Dexcom CGM use in the hospital setting.
The new guideline suggests CGM be used to detect trends in glycemic management, with insulin dosing decisions made with point-of-care glucose measure (the standard of care).
To implement CGM for glycemic management in hospitals, Dr. Korytkowski said, would require “extensive staff and nursing education to have people with expertise available to provide support to nursing personnel who are both placing these devices, changing these devices, looking at trends, and then knowing when to remove them for certain procedures such as MRI or radiologic procedures.”
“We know that not all hospitals may be readily available to use these devices,” she said. “It is an area of active research. But the use of these devices during the pandemic, in both critical care and non–critical care setting has really provided us with a lot of information that was used to formulate this suggestion in the guideline.”
The document addresses the following areas: CGM, continuous subcutaneous insulin infusion pump therapy, inpatient diabetes education, prespecified preoperative glycemic targets, use of neutral protamine Hagedorn insulin for glucocorticoid or enteral nutrition-associated hyperglycemia, noninsulin therapies, preoperative carbohydrate-containing oral fluids, carbohydrate counting for prandial (mealtime) insulin dosing, and correctional and scheduled (basal or basal bolus) insulin therapies.
Nine key recommendations
Dr. Korytkowski identified nine key recommendations:
- CGM systems can help guide glycemic management with reduced risk for hypoglycemia.
- Patients experiencing glucocorticoid- or enteral nutrition–associated hyperglycemia require scheduled insulin therapy to address anticipated glucose excursions.
- Selected patients using insulin pump therapy prior to a hospital admission can continue to use these devices in the hospital if they have the mental and physical capacity to do so with knowledgeable hospital personnel.
- Diabetes self-management education provided to hospitalized patients can promote improved glycemic control following discharge with reductions in the risk for hospital readmission. “We know that is recommended for patients in the outpatient setting but often they do not get this,” she said. “We were able to observe that this can also impact long-term outcomes “
- Patients with diabetes scheduled for elective surgery may have improved postoperative outcomes when preoperative hemoglobin A1c is 8% or less and preoperative blood glucose is less than 180 mg/dL. “This recommendation answers the question: ‘Where should glycemic goals be for people who are undergoing surgery?’ ”
- Providing preoperative carbohydrate-containing beverages to patients with known diabetes is not recommended.
- Patients with newly recognized hyperglycemia or well-managed diabetes on noninsulin therapy may be treated with correctional insulin alone as initial therapy at hospital admission.
- Some noninsulin diabetes therapies can be used in combination with correction insulin for patients with type 2 diabetes who have mild hyperglycemia.
- Correctional insulin – “otherwise known as sliding-scale insulin” – can be used as initial therapy for patients with newly recognized hyperglycemia or type 2 diabetes treated with noninsulin therapy prior to hospital admission.
- Scheduled insulin therapy is preferred for patients experiencing persistent blood glucose values greater than 180 mg/dL and is recommended for patients using insulin therapy prior to admission.
The guideline writers’ hopes
“We hope that this guideline will resolve debates” about appropriate preoperative glycemic management and when sliding-scale insulin can be used and should not be used, said Dr. Korytkowski.
The authors also hope that “it will stimulate research funding for this very important aspect of diabetes care, and that hospitals will recognize the importance of having access to knowledgeable diabetes care and education specialists who can provide staff education regarding inpatient glycemic management, provide oversight for patients using insulin pump therapy or CGM devices, and empower hospital nurses to provide diabetes [self-management] education prior to patient discharge.”
Claire Pegg, the patient representative on the panel, hopes “that this guideline serves as the beginning of a conversation that will allow inpatient caregivers to provide individualized care to patients – some of whom may be self-sufficient with their glycemic management and others who need additional assistance.”
Development of the guideline was funded by the Endocrine Society. Dr. Korytkowski has reported no relevant financial disclosures.
A version of this article first appeared on Medscape.com.
Goal-directed glycemic management – which may include new technologies for glucose monitoring – for non–critically ill hospitalized patients who have diabetes or newly recognized hyperglycemia can improve outcomes, according to a new practice guideline from the Endocrine Society.
Even though roughly 35% of hospitalized patients have diabetes or newly discovered hyperglycemia, there is “wide variability in glycemic management in clinical practice,” writing panel chair Mary Korytkowski, MD, from the University of Pittsburgh, said at the annual meeting of the Endocrine Society. “These patients get admitted to every patient service in the hospital, meaning that every clinical service will encounter this group of patients, and their glycemic management can have a major effect on their outcomes. Both short term and long term.”
This guideline provides strategies “to achieve previously recommended glycemic goals while also reducing the risk for hypoglycemia, and this includes inpatient use of insulin pump therapy or continuous glucose monitoring [CGM] devices, among others,” she said.
It also includes “recommendations for preoperative glycemic goals as well as when the use of correctional insulin – well known as sliding scale insulin – may be appropriate” and when it is not.
The document, which replaces a 2012 guideline, was published online in the Journal of Clinical Endocrinology & Metabolism.
A multidisciplinary panel developed the document over the last 3 years to answer 10 clinical practice questions related to management of non–critically ill hospitalized patients with diabetes or newly discovered hyperglycemia.
Use of CGM devices in hospital
The first recommendation is: “In adults with insulin-treated diabetes hospitalized for noncritical illness who are at high risk of hypoglycemia, we suggest the use of real-time [CGM] with confirmatory bedside point-of-care blood glucose monitoring for adjustments in insulin dosing rather than point-of-care blood glucose rather than testing alone in hospital settings where resources and training are available.” (Conditional recommendation. Low certainty of evidence).
“We were actually very careful in terms of looking at the data” for use of CGMs, Dr. Korytkowski said in an interview.
Although CGMs are approved by the Food and Drug Administration in the outpatient setting, and that’s becoming the standard of care there, they are not yet approved for in-hospital use.
However, the FDA granted an emergency allowance for use of CGMs in hospitals during the COVID-19 pandemic.
That was “when everyone was scrambling for what to do,” Dr. Korytkowski noted. “There was a shortage of personal protective equipment and a real interest in trying to limit the amount of exposure of healthcare personnel in some of these really critically ill patients for whom intravenous insulin therapy was used to control their glucose level.”
On March 1, the FDA granted Breakthrough Devices Designation for Dexcom CGM use in the hospital setting.
The new guideline suggests CGM be used to detect trends in glycemic management, with insulin dosing decisions made with point-of-care glucose measure (the standard of care).
To implement CGM for glycemic management in hospitals, Dr. Korytkowski said, would require “extensive staff and nursing education to have people with expertise available to provide support to nursing personnel who are both placing these devices, changing these devices, looking at trends, and then knowing when to remove them for certain procedures such as MRI or radiologic procedures.”
“We know that not all hospitals may be readily able to use these devices,” she said. “It is an area of active research. But the use of these devices during the pandemic, in both critical care and non–critical care settings, has really provided us with a lot of information that was used to formulate this suggestion in the guideline.”
The document addresses the following areas: CGM, continuous subcutaneous insulin infusion pump therapy, inpatient diabetes education, prespecified preoperative glycemic targets, use of neutral protamine Hagedorn insulin for glucocorticoid or enteral nutrition-associated hyperglycemia, noninsulin therapies, preoperative carbohydrate-containing oral fluids, carbohydrate counting for prandial (mealtime) insulin dosing, and correctional and scheduled (basal or basal bolus) insulin therapies.
Nine key recommendations
Dr. Korytkowski identified nine key recommendations:
- CGM systems can help guide glycemic management with reduced risk for hypoglycemia.
- Patients experiencing glucocorticoid- or enteral nutrition–associated hyperglycemia require scheduled insulin therapy to address anticipated glucose excursions.
- Selected patients using insulin pump therapy prior to a hospital admission can continue to use these devices in the hospital if they have the mental and physical capacity to do so and knowledgeable hospital personnel are available.
- Diabetes self-management education provided to hospitalized patients can promote improved glycemic control following discharge, with reductions in the risk for hospital readmission. “We know that is recommended for patients in the outpatient setting, but often they do not get this,” she said. “We were able to observe that this can also impact long-term outcomes.”
- Patients with diabetes scheduled for elective surgery may have improved postoperative outcomes when preoperative hemoglobin A1c is 8% or less and preoperative blood glucose is less than 180 mg/dL. “This recommendation answers the question: ‘Where should glycemic goals be for people who are undergoing surgery?’ ”
- Providing preoperative carbohydrate-containing beverages to patients with known diabetes is not recommended.
- Patients with newly recognized hyperglycemia or well-managed diabetes on noninsulin therapy may be treated with correctional insulin alone as initial therapy at hospital admission.
- Some noninsulin diabetes therapies can be used in combination with correctional insulin for patients with type 2 diabetes who have mild hyperglycemia.
- Correctional insulin – “otherwise known as sliding-scale insulin” – can be used as initial therapy for patients with newly recognized hyperglycemia or type 2 diabetes treated with noninsulin therapy prior to hospital admission.
- Scheduled insulin therapy is preferred for patients experiencing persistent blood glucose values greater than 180 mg/dL and is recommended for patients using insulin therapy prior to admission. (The glucose and A1c cutoffs quoted in this list are gathered into a short illustrative sketch below.)
The guideline writers’ hopes
“We hope that this guideline will resolve debates” about appropriate preoperative glycemic management and about when sliding-scale insulin should and should not be used, said Dr. Korytkowski.
The authors also hope that “it will stimulate research funding for this very important aspect of diabetes care, and that hospitals will recognize the importance of having access to knowledgeable diabetes care and education specialists who can provide staff education regarding inpatient glycemic management, provide oversight for patients using insulin pump therapy or CGM devices, and empower hospital nurses to provide diabetes [self-management] education prior to patient discharge.”
Claire Pegg, the patient representative on the panel, hopes “that this guideline serves as the beginning of a conversation that will allow inpatient caregivers to provide individualized care to patients – some of whom may be self-sufficient with their glycemic management and others who need additional assistance.”
Development of the guideline was funded by the Endocrine Society. Dr. Korytkowski has reported no relevant financial disclosures.
A version of this article first appeared on Medscape.com.
FROM ENDO 2022
Opioid use in the elderly a dementia risk factor?
Opioid use is linked to an increased risk of dementia in adults aged 75-80, in new findings that suggest exposure to these drugs may be another modifiable risk factor for dementia.
“Clinicians and others may want to consider that opioid exposure in those aged 75-80 increases dementia risk, and to balance the potential benefits of opioid use in old age with adverse side effects,” said Stephen Z. Levine, PhD, professor, department of community mental health, University of Haifa (Israel).
The study was published online in the American Journal of Geriatric Psychiatry.
Widespread use
Evidence points to a relatively high rate of opioid prescriptions among older adults. A Morbidity and Mortality Weekly Report analysis noted that 19.2% of the U.S. adult population filled an opioid prescription in 2018, with the rate in those over 65 more than double that of adults aged 20-24 years (25% vs. 11.2%).
Disorders and illnesses for which opioids might be prescribed, including cancer and some pain conditions, “are far more prevalent in old age than at a younger age,” said Dr. Levine.
This high rate of opioid use underscores the need to consider the risks of opioid use in old age, said Dr. Levine. “Unfortunately, studies of the association between opioid use and dementia risk in old age are few, and their results are inconsistent.”
The study included 91,307 Israeli citizens aged 60 and over without dementia who were enrolled in the Meuhedet Healthcare Services, a nonprofit health maintenance organization (HMO) serving 14% of the country’s population. Meuhedet has maintained an up-to-date dementia registry since 2002.
The average age of the study sample was 68.29 years at the start of the study (in 2012).
In Israel, opioids are prescribed for a 30-day period. In this study, opioid exposure was defined as opioid medication fills covering 60 days (or two prescriptions) within a 120-day interval.
The primary outcome was incident dementia during follow-up from Jan. 1, 2013 to Oct. 30, 2017. The analysis controlled for a number of factors, including age, sex, smoking status, health conditions such as arthritis, depression, diabetes, osteoporosis, cognitive decline, vitamin deficiencies, cancer, cardiovascular conditions, and hospitalizations for falls.
Researchers also accounted for the competing risk of mortality.
During the study, 3.1% of subjects were exposed to opioids at a mean age of 73.94 years, and 5.8% of subjects developed dementia at an average age of 78.07 years.
Increased dementia risk
The risk of incident dementia was significantly increased in those exposed to opioids versus unexposed individuals in the 75- to 80-year age group (adjusted hazard ratio, 1.39; 95% confidence interval, 1.01-1.92; z statistic = 2.02; P < .05).
The authors noted that the effect size for opioid exposure in this elderly age group is comparable to that of other potentially modifiable risk factors for dementia, including body mass index and smoking.
The current study could not determine the biological explanation for the increased dementia risk among older opioid users. “Causal notions are challenging in observational studies and should be viewed with caution,” Dr. Levine noted.
However, a plausible mechanism highlighted in the literature is that opioids promote apoptosis of microglia and neurons that contribute to neurodegenerative diseases, he said.
The study included 14 sensitivity analyses, including those that looked at females, subjects older than 70, smokers, and groups with and without comorbid health conditions. The only sensitivity analysis that didn’t have similar findings to the primary analysis looked at dementia risk restricted to subjects without a vitamin deficiency.
“It’s reassuring that 13 of 14 sensitivity analyses found a significant association between opioid exposure and dementia risk,” said Dr. Levine.
Some prior studies did not show an association between opioid exposure and dementia risk. One possible reason for the discrepancy with the current findings is that the previous research didn’t account for age-specific opioid use effects, or the competing risk of mortality, said Dr. Levine.
Clinicians have a number of potential alternatives to opioids for treating various conditions, including acetaminophen, nonsteroidal anti-inflammatory drugs, amine reuptake inhibitors (ARIs), membrane stabilizers, muscle relaxants, topical capsaicin, botulinum toxin, cannabinoids, and steroids.
A limitation of the study was that it didn’t adjust for all possible comorbid health conditions, including vascular conditions, or for benzodiazepine use and surgical procedures.
In addition, since up to 50% of dementia cases go undetected, it’s possible that some in the opioid-unexposed group actually had undiagnosed dementia, which would reduce the effect sizes in the results.
Reverse causality is also a possibility as the neuropathological process associated with dementia could have started prior to opioid exposure. In addition, the results are limited to prolonged opioid exposure.
Interpret with caution
Commenting on the study, David Knopman, MD, a neurologist at Mayo Clinic in Rochester, Minn., whose research involves late-life cognitive disorders, was skeptical.
“On the face of it, the fact that an association was seen only in one narrow age range – 75 to 80 years – ought to raise serious suspicion about the reliability and validity of the claim that opioid use is a risk factor for dementia,” he said.
Although the researchers performed several sensitivity analyses, including accounting for mortality, “pharmacoepidemiological studies are terribly sensitive to residual biases” related to physician and patient choices related to medication use, added Dr. Knopman.
The claim that opioids are a dementia risk “should be viewed with great caution” and should not influence use of opioids where they’re truly indicated, he said.
“It would be a great pity if patients with pain requiring opioids avoid them because of fears about dementia based on the dubious relationship between age and opioid use.”
Dr. Levine and Dr. Knopman report no relevant financial disclosures.
A version of this article first appeared on Medscape.com.
FROM AMERICAN JOURNAL OF GERIATRIC PSYCHIATRY
Collagen ‘tile’ delivers postsurgical radiation in glioblastoma
A collagen “tile” implanted at the time of surgery delivers radiation directly to the resection site in glioblastoma and spares healthy tissue, new research suggests.
The results showed that inserting a collagen matrix containing radioactive seeds into the brain after surgery did not impede wound healing, and the approach had a favorable safety profile, the researchers note.
Benefits for patients undergoing this GammaTile (GT) intervention include not having to wait weeks to receive radiation treatment, which in turn improves their quality of life, said study investigator Clark C. Chen, MD, PhD, chair, department of neurosurgery, University of Minnesota Medical School, Minneapolis.
“These initial results are highly promising and offer hope for patients afflicted with an otherwise devastating disease,” Dr. Chen said in an interview.
If replicated in larger trials, GT therapy “could define a new standard of care, and there would really be no reason why patients shouldn’t get this therapy,” he added.
This is the first clinical series describing GT use since its approval by the U.S. Food and Drug Administration (FDA) for recurrent brain cancer.
The findings were presented at the annual meeting of the American Association of Neurological Surgeons (AANS) and were published recently in Neuro-Oncology Advances.
Radioactive seeds
GT therapy is a form of brachytherapy in which radioactive sources are placed adjacent to cancerous tissue. It consists of radioactive seeds embedded within a collagen tile.
The neurosurgeon inserts these “tiles” immediately after tumor removal to cover the entire resection cavity, Dr. Chen said. The tiles maintain the cavity architecture to prevent radiation “hot spots” associated with cavity collapse.
Dr. Chen noted the therapy is “short range,” with most of the radiation delivered within 8 millimeters of the radioactive seeds.
The radiation lasts for about a month, and the collagen tiles are eventually absorbed by the body. “You put in the tiles and you don’t need to do anything more,” Dr. Chen said.
GT has a number of advantages. Unlike with traditional brachytherapy, the collagen tile provides a buffer around the radiation sources, allowing delivery of the optimal radiation dose while preserving healthy tissue.
It also avoids the up to 6 weeks patients have to wait postsurgery to begin external beam radiation therapy. “If you start radiation too early, it actually compromises wound healing, and in the meantime the tumor is growing,” said Dr. Chen.
“I have several patients where I removed a large tumor and within that 6-week period, the tumor came back entirely,” he added.
With GammaTile, however, radiation from the seeds kills the tumor while the body heals.
Safety profile
The study included 22 patients (mean age, 57.7 years; 15 men, 7 women) with wild-type isocitrate dehydrogenase glioblastoma. They were all having surgery for recurrent tumors.
“One of the most challenging aspects of glioblastomas is that not only do the tumors come back, they come back immediately adjacent to where you have done the surgery, and for many patients this is demoralizing,” Dr. Chen said.
Six participants had O6-methylguanine-DNA methyltransferase (MGMT)–methylated glioblastoma, while the others had unmethylated MGMT.
The mean follow-up from initial diagnosis was 733 days (2 years).
Results showed one patient had to be readmitted to the hospital for hydrocephalus, but there were no re-admissions within 30 days attributable to GT.
Despite participants having undergone second and even third resections through the same surgical incision, there were no wound infections. “One of the concerns of giving radiation right after surgery is it can compromise wound healing, and this is why you wait 6 weeks,” Dr. Chen noted.
He stressed that no patient in the study suffered from adverse radiation effects that required medical or surgical intervention.
As the radiation is so short-range, hair loss and skin irritation are not side effects of GT, he added.
“The radiation is inside the brain and highly targeted, so it doesn’t hit hair follicles,” said Dr. Chen. “As best as I can observe in these patients, I did not see toxicity associated with radiation.”
One and done
Among the 22 participants, 18 had neurologic symptoms at baseline. There were no new neurologic deficits that developed after GT placement.
In addition, GT therapy improved local control, preventing the tumor from growing back at the surgical site: local control was 86% at 6 months and 81% at 12 months.
The median progression-free survival was about 8 months. The median overall survival was 20 months (about 600 days) for the unmethylated MGMT group and 37.4 months (about 1120 days) for the methylated group.
Outcomes compared favorably with those of an independent glioblastoma cohort of similar patients who did not receive GT treatment during the study period, Dr. Chen noted.
“This therapy can potentially redefine how we treat glioblastoma patients whose cancer came back,” he said.
A study limitation was that it did not include quality-of-life data, which makes it challenging to assess the therapy’s overall impact, Dr. Chen said. However, he added that from his experience, patients very much appreciate not having to repeatedly take time off work for clinic or hospital visits to receive radiation treatments.
“One of the beauties of this therapy is it’s a one-and-done deal,” he said.
Interesting, timely
Commenting for this news organization, William T. Curry Jr, MD, co-director at MassGeneral Neuroscience and director of neurosurgical oncology at Mass General Cancer Center, Boston, called the study “interesting and timely.”
These new data “underscore that GT is safe in patients that have undergone gross total resection of recurrent glioblastoma and that rates of progression-free survival may exceed those treated with resection alone,” said Dr. Curry, who was not involved with the research.
“Surgeons are excited about anything that has the potential to improve outcomes for patients with this very challenging disease, and it is wonderful to be able to offer hope and survival tools to patients,” he added.
However, Dr. Curry noted there are challenges and potential biases when studying survival in cancer patients without conducting a randomization process. The investigators “admit to methodological flaws inherent in the single-arm design in a patient population with recurrent glioblastoma not treated uniformly,” he said.
In addition, he noted overall survival may not have been related to the GT intervention. “Multicenter randomization is probably required to get to the bottom of the survival advantage in different subsets of glioblastoma patients,” Dr. Curry said.
Further research is needed to confirm the efficacy, appropriate indications, and timing of the intervention, but “I would support a randomized multicenter study in patients undergoing near gross total resection of recurrent glioblastoma,” he concluded.
The study received no outside funding. Dr. Chen and Dr. Curry have disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM AANS 2022