Genetic testing warranted in epilepsy of unknown origin
ORLANDO — Genetic testing is warranted in patients with epilepsy of unknown origin, new research suggests. Investigators found pathogenic genetic variants in over 40% of patients with epilepsy of unknown cause who underwent genetic testing. Such testing is particularly beneficial for those with early-onset epilepsy and those with comorbid developmental delay, said study investigator Yi Li, MD, PhD, clinical assistant professor, Department of Neurology & Neurological Sciences, Stanford University School of Medicine, Stanford, California.
But genetic testing should be considered part of the standard workup for every patient with epilepsy of unknown etiology, she said.
Dr. Li noted research showing that a diagnosis of a genetic epilepsy leads to alteration of treatment in about 20% of cases — for example, starting a specific antiseizure medication or avoiding a treatment such as a sodium channel blocker in patients diagnosed with Dravet syndrome. A genetic diagnosis also may make patients eligible for clinical trials investigating gene therapies.
Genetic testing results may end a long and exhausting “diagnostic odyssey” that families have been on, she said. Patients often wait more than a decade to get genetic testing, the study found.
The findings were presented at the annual meeting of the American Epilepsy Society.
Major Delays
About 20%-30% of epilepsy is caused by acquired conditions such as stroke, tumor, or head injury. The remaining 70%-80% is believed to be due to one or more genetic factors.
Genetic testing has become standard for children with early-onset epilepsy, but it’s not common practice among adults with the condition — at least not yet.
The retrospective study involved a chart review of patient electronic health records from 2018-2023. Researchers used the Stanford electronic health record Cohort Discovery tool (STARR) database to identify 286 patients over age 16 years with epilepsy who had records of genetic testing.
Of the 286 patients, 148 were male and 138 female, and mean age was approximately 30 years. Among those with known epilepsy types, 53.6% had focal epilepsy and 28.8% had generalized epilepsy.
The mean age of seizure onset was 11.9 years, but the mean age at genetic testing was 25.1 years. “There’s a gap of about 13 or 14 years for genetic workup after a patient has a first seizure,” said Dr. Li.
Such a “huge delay” means patients may miss out on “potential precision treatment choices,” she said.
And having a diagnosis can connect patients to others with the same condition as well as to related organizations and communities that offer support, she added.
Types of genetic testing identified in the study included panel testing, which looks at a set of genes associated with epilepsy; whole exome sequencing (WES), which examines all of a person’s roughly 20,000 genes in one test; and microarray testing, which detects missing sections of chromosomes. WES had the highest diagnostic yield (48%), followed by genetic panel testing (32.7%) and microarray testing (20.9%).
These tests collectively identified pathogenic variants in 40.9% of patients. In addition, test results showed that 53.1% of patients had variants of uncertain significance.
In the full cohort, the most commonly identified variants were mutations in TSC1 (which causes tuberous sclerosis), SCN1A (which causes Dravet syndrome), and MECP2. Among patients with seizure onset after age 1 year, MECP2 and DEPDC5 were the two most commonly identified pathogenic variants.
Researchers examined factors possibly associated with a higher risk for genetic epilepsy, including family history, comorbid developmental delay, febrile seizures, status epilepticus, perinatal injury, and seizure onset age. In an adjusted analysis, comorbid developmental delay (estimate 2.338; 95% confidence interval [CI], 1.402-3.900; P = .001) and seizure onset before 1 year (estimate 2.365; 95% CI, 1.282-4.366; P = .006) predicted higher yield of pathogenic variants related to epilepsy.
Dr. Li noted that study participants with a family history of epilepsy were not more likely to test positive for a genetic link, so doctors shouldn’t rule out testing in patients if there’s no family history.
Both the International League Against Epilepsy (ILAE) and the National Society of Genetic Counselors (NSGC) recommend genetic testing in adult epilepsy patients, with the AES endorsing the NSGC guideline.
Although testing is becoming increasingly accessible, insurance companies don’t always cover the cost.
Dr. Li said she hopes her research raises awareness among clinicians that there’s more they can do to improve care for epilepsy patients. “We should offer patients genetic testing if we don’t have a clear etiology.”
Valuable Evidence
Commenting on the research findings, Annapurna Poduri, MD, MPH, director, Epilepsy Genetics Program, Boston Children’s Hospital, Boston, Massachusetts, said this research “is incredibly important.”
“What’s really telling about this study and others that have come up over the last few years is they’re real-world retrospective studies, so they’re looking back at patients who have been seen over many, many years.”
The research provides clinicians, insurance companies, and others with evidence that genetic testing is “valuable and can actually improve outcomes,” said Dr. Poduri.
She noted that 20 years ago, there were only a handful of genes identified as being involved with epilepsy, most related to sodium or potassium channels. But since then, “the technology has just raced ahead” to the point where now “dozens of genes” have been identified.
Not only does knowing the genetic basis of epilepsy improve management, but it offers families some peace of mind. “They blame themselves” for their loved one’s condition, said Dr. Poduri. “They may worry it was something they did in pregnancy; for example, maybe it was because [they] didn’t take that vitamin one day.”
Diagnostic certainty also means that patients “don’t have to do more tests which might be invasive” and unnecessarily costly.
Drs. Li and Poduri report no relevant conflicts of interest.
A version of this article first appeared on Medscape.com.
FROM AES 2023
Excessive TV-watching tied to elevated risk for dementia, Parkinson’s disease, and depression
TOPLINE:
Excessive TV-watching is tied to an elevated risk for dementia, Parkinson’s disease (PD), and depression, whereas a limited amount of daily computer use that is not work-related is linked to a lower risk for dementia.
METHODOLOGY:
- Investigators analyzed data on 473,184 people aged 39-72 years from the UK Biobank who were enrolled from 2006 to 2010 and followed until a diagnosis of dementia, PD, depression, death, or study end (2018 for Wales residents; 2021 for residents of England and Scotland).
- Participants reported on the number of hours they spent outside of work exercising, watching television, and using the computer.
- MRI was conducted to determine participants’ brain volume.
TAKEAWAY:
- During the study, 6096 people developed dementia, 3000 developed PD, 23,600 developed depression, 1200 developed dementia and depression, and 486 developed PD and depression.
- Compared with those who watched TV for under 1 hour per day, those who reported watching 4 or more hours per day had a 28% higher risk for dementia (adjusted hazard ratio [aHR], 1.28; 95% CI, 1.17-1.39), a 35% higher risk for depression (aHR, 1.35; 95% CI, 1.29-1.40), and a 16% greater risk for PD (aHR, 1.16; 95% CI, 1.03-1.29).
- However, moderate computer use outside of work seemed somewhat protective. Participants who used the computer for 30-60 minutes per day had lower risks for dementia (aHR, 0.68; 95% CI, 0.64-0.72), PD (aHR, 0.86; 95% CI, 0.79-0.93), and depression (aHR, 0.85; 95% CI, 0.83-0.88) compared with those who reported the lowest levels of computer usage.
- Replacing 30 minutes per day of computer time with an equal amount of structured exercise was associated with decreased risk for dementia (aHR, 0.74; 95% CI, 0.85-0.95) and PD (aHR, 0.84; 95% CI, 0.78-0.90).
IN PRACTICE:
The association between extended periods of TV use and higher risk for PD and dementia could be explained by a lack of activity, the authors note. They add that sedentary behavior is “associated with biomarkers of low-grade inflammation and changes in inflammation markers that could initiate and/or worsen neuroinflammation and contribute to neurodegeneration.”
SOURCE:
Hanzhang Wu, PhD, of Tianjin University of Traditional Chinese Medicine in Tianjin, China, led the study, which was published online in the International Journal of Behavioral Nutrition and Physical Activity.
LIMITATIONS:
Screen behaviors were assessed using self-report measures, which are subject to recall bias. Also, there may have been confounding variables for which the investigators did not account.
DISCLOSURES:
The study was funded by the National Natural Science Foundation of China, the Tianjin Major Public Health Science and Technology Project, the National Health Commission of China, the Food Science and Technology Foundation of Chinese Institute of Food Science and Technology, the China Cohort Consortium, and the Chinese Nutrition Society Nutrition Research Foundation–DSM Research Fund, China. There were no disclosures reported.
Eve Bender has no relevant financial relationships.
A version of this article appeared on Medscape.com.
Medication overuse headache a pain to treat
BARCELONA, SPAIN — Around half of all patients with chronic headache or migraine overuse their medication, leading to aggravated or new types of headaches. “Medication overuse headache” is the third most frequent type of headache, affecting some 60 million people or around 1% of the world’s population.
“It’s a big problem,” Sait Ashina, MD, of Beth Israel Deaconess Medical Center, Boston, Massachusetts, told the audience in an opening plenary at the 17th European Headache Congress in Barcelona.
Medication overuse headache is characterized by an increasing headache frequency and progressive use of short-term medication and is recognized as a major factor in the shift from episodic to chronic headache.
Medication overuse headache is often underrecognized, and educating doctors and patients is a key element of effective treatment. Recognizing that headache medication is being overused is the crucial first step, followed by advising the patient to discontinue the medication. But discontinuation poses its own problems, as it can cause withdrawal symptoms.
According to a longitudinal population-based study published in 2008, most patients overuse acetaminophen or paracetamol, followed by nonsteroidal anti-inflammatory drugs and 5-hydroxytryptamine agonists (triptans) and, in the United States, barbiturates and opioids.
What’s the Best Treatment Strategy?
Medication overuse headache is often treated by complete withdrawal from medication, but the withdrawal symptoms can be severe. They include nausea and vomiting, arterial hypertension, tachycardia, sleep disturbances, anxiety, and restlessness, with their duration and severity depending largely on the type of headache medication that has been overused.
There is, however, no consensus on how best to treat medication overuse headache. One study compared three strategies: withdrawal plus preventive treatment, preventive treatment without withdrawal, and withdrawal with optional preventive treatment 2 months after withdrawal. All three strategies proved effective, but the research team concluded that withdrawal combined with preventive medication from the start of withdrawal was the recommended approach.
The electronic headache diary has proved very useful because it aids accurate diagnosis by providing clear insight into a patient’s condition. Information from the diary is more reliable than retrospective self-reports because patients often underestimate the frequency of their headaches, migraines, and use of medication.
Patients who are treated for medication overuse headache tend to have a high relapse rate. So, the electronic headache diary can also be very useful for follow-up by alerting patients and clinicians when headaches and medication overuse are increasing again.
“After diagnosing medication overuse or medication overuse headache, we advise our patients to discontinue the medication,” said Judith Pijpers of Leiden University Medical Center, the Netherlands. “This provides clinically relevant improvements in headache frequency in a majority of patients and a significant reduction in headache days.”
In 2019, Dr. Pijpers and her colleagues published the results of a double-blind randomized controlled trial showing that botulinum toxin A, which is widely used to treat chronic migraine, has no additional benefit over acute withdrawal in patients with chronic migraine and medication overuse.
“We saw no difference between the groups during both the double-blind and the open label phase,” said Dr. Pijpers. “And that is why we do not give patients botulinum toxin A during withdrawal.”
A further trial nested within the botulinum toxin study showed modest benefits for a behavioral intervention, delivered by a headache nurse, comprising education, motivational interviewing, and value-based activity planning during withdrawal therapy.
Patients can be stratified to some extent based on the type of headache they have and the medication they are taking for it.
“You can predict [a patient’s response] to some extent from the type of medication they overuse and the type of underlying primary headache,” Dr. Pijpers said in an interview.
“Those with underlying tension-type headache have different withdrawal symptoms than those with underlying migraine, and the withdrawal symptoms tend to be somewhat shorter if a patient overuses triptans compared to analgesics.”
Predicting Patients’ Responses to Migraine Medication
Dr. Pijpers and her colleagues recently published the results of a cohort study suggesting that cutaneous allodynia may predict how patients with migraine respond to withdrawal therapy. Nearly 75% of the 173 patients enrolled in the study reported experiencing allodynia — pain caused by a stimulus that does not normally cause pain. The study showed that absence of allodynia was predictive of a good outcome for patients after withdrawal therapy and of reversion from chronic to episodic migraine.
The ability to accurately predict patients’ responses could pave the way for personalized treatments of medication overuse headache.
A version of this article appeared in Medscape.com.
BARCELONA, SPAIN — Around half of all patients with chronic headache or migraine overuse their medication, leading to aggravated or new types of headaches. “Medication overuse headache” is the third most frequent type of headache, affecting some 60 million people or around 1% of the world’s population.
“It’s a big problem,” Sait Ashina, MD, of Beth Israel Deaconess Medical Center, Boston, Massachusetts, told the audience in an opening plenary at the 17th European Headache Congress in Barcelona.
Medication overuse headache is characterized by an increasing headache frequency and progressive use of short-term medication and is recognized as a major factor in the shift from episodic to chronic headache.
Medication overuse headache is often underrecognized, yet educating doctors and patients is a crucial element of effective treatment. Recognizing that headache medication is being overused is the essential first step, followed by advising the patient to discontinue the medication. Discontinuation, however, poses its own problems, as it can cause withdrawal symptoms.
According to a longitudinal population-based study published in 2008, most patients overuse acetaminophen (paracetamol), followed by nonsteroidal anti-inflammatory drugs and 5-hydroxytryptamine agonists (triptans) and, in the United States, barbiturates and opioids.
What’s the Best Treatment Strategy?
Medication overuse headache is often treated by complete withdrawal from the overused medication, but the withdrawal symptoms can be severe. They include nausea and vomiting, arterial hypertension, tachycardia, sleep disturbances, anxiety, and restlessness, with their duration and severity depending largely on the type of headache medication that has been overused.
There is, however, no consensus on how best to treat medication overuse headache. One trial compared three strategies: withdrawal plus preventive treatment, preventive treatment without withdrawal, and withdrawal with optional preventive treatment 2 months after withdrawal. All three strategies were effective, but the research team concluded that withdrawal combined with preventive medication from the start of withdrawal was the recommended approach.
The electronic headache diary has proved to be very useful, as it can aid accurate diagnosis by providing clear insights into a patient’s condition. Information from the diary is more reliable than self-reports because patients often underestimate the frequency of their headaches, migraines, and use of medication.
Patients who are treated for medication overuse headache tend to have a high relapse rate. So, the electronic headache diary can also be very useful for follow-up by alerting patients and clinicians when headaches and medication overuse are increasing again.
“After diagnosing medication overuse or medication overuse headache, we advise our patients to discontinue the medication,” said Judith Pijpers of Leiden University Medical Center, the Netherlands. “This provides clinically relevant improvements in headache frequency in a majority of patients and a significant reduction in headache days.”
In 2019, Dr. Pijpers and her colleagues published the results of a double-blind randomized controlled trial showing that botulinum toxin A, which is widely used to treat chronic migraine, has no additional benefit over acute withdrawal in patients with chronic migraine and medication overuse.
“We saw no difference between the groups during both the double-blind and the open-label phase,” said Dr. Pijpers. “And that is why we do not give patients botulinum toxin A during withdrawal.”
A further trial within the botulinum toxin study showed modest benefits for a behavioral intervention delivered by a headache nurse, comprising education, motivational interviewing, and value-based activity planning during withdrawal therapy.
Patients can be stratified to some extent based on the type of headache they have and the medication they are taking for it.
“You can predict [a patient’s response] to some extent from the type of medication they overuse and the type of underlying primary headache,” Dr. Pijpers said in an interview.
“Those with underlying tension-type headache have different withdrawal symptoms than those with underlying migraine, and the withdrawal symptoms tend to be somewhat shorter if a patient overuses triptans compared to analgesics.”
Predicting Patients’ Responses to Migraine Medication
Dr. Pijpers and her colleagues recently published the results of a cohort study suggesting that cutaneous allodynia may predict how patients with migraine respond to withdrawal therapy. Nearly 75% of the 173 patients enrolled in the study reported experiencing allodynia — pain caused by a stimulus that does not normally cause pain. The study showed that absence of allodynia was predictive of a good outcome for patients after withdrawal therapy and of reversion from chronic to episodic migraine.
The ability to accurately predict patients’ responses could pave the way for personalized treatments of medication overuse headache.
A version of this article appeared in Medscape.com.
FROM EHC 2023
Sleep disorders linked to increased mortality risk in epilepsy
ORLANDO — Comorbid sleep disorders are associated with an increased mortality risk, including risk for sudden unexpected death in epilepsy (SUDEP), in patients with epilepsy, new research shows.
SUDEP is a major concern for patients with epilepsy, said study investigator Marion Lazaj, MSc, Center for Neuroscience Studies, Queen’s University, Kingston, Ontario, Canada, but she believes that SUDEP risk assessment is overly focused on seizure control.
“We want to push the idea that this mortality risk assessment needs to be widened to include sleep factors, and not just sleep disorders but even sleep disturbances,” said Ms. Lazaj.
She also believes physicians should routinely discuss SUDEP with their patients with epilepsy. Given that the incidence of SUDEP is only about 1%, many clinicians don’t want to unduly frighten their patients, she added.
The findings were presented at the annual meeting of the American Epilepsy Society (AES).
The retrospective study included chart data from 1506 consecutive patients diagnosed with epilepsy at a single center over 4 years. The mean age of participants was about 37 years, but the age range was wide, said Ms. Lazaj.
The cohort was divided into two groups. Group 1 included 1130 patients without a comorbid sleep disorder, and Group 2 included 376 patients with a primary comorbid sleep disorder, mostly obstructive sleep apnea (OSA) but also restless legs syndrome or insomnia.
Researchers gathered demographic information, including age, sex, employment status, and education; epilepsy-related data such as epilepsy type, duration, and number of antiseizure medications; and relevant information from hospital and emergency room (ER) records.
SUDEP Inventory
Researchers assessed SUDEP risk using the revised SUDEP-7 risk inventory. The first four items on this inventory focus on generalized tonic-clonic seizure activity and occurrence, while the others assess the number of antiseizure medications, epilepsy duration, and the presence of developmental delay.
Investigators then stratified patients into high mortality risk (SUDEP-7 score of 5 or greater) and low mortality risk (score less than 5) groups.
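The stratification rule described above reduces to a single cutoff on the inventory total. The sketch below is illustrative only, not the study’s actual code; the function name and the assumption that a total SUDEP-7 score has already been computed from the seven items are mine:

```python
# Illustrative sketch: group assignment by revised SUDEP-7 total score,
# using the cutoff reported in the study (score >= 5 -> high risk).

HIGH_RISK_CUTOFF = 5

def stratify(sudep7_score: int) -> str:
    """Return the mortality-risk group ('high' or 'low') for a SUDEP-7 score."""
    return "high" if sudep7_score >= HIGH_RISK_CUTOFF else "low"

groups = [stratify(s) for s in (2, 5, 7)]  # -> ['low', 'high', 'high']
```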
Results showed a significant association between a high mortality risk and having a comorbid sleep disorder (P = .033). Researchers also looked at all-cause mortality, including drownings and suicides, and found a similar significant association (P = .026). There was also an association between high risk and accidents and trauma (P = .042).
The researchers had access to overnight diagnostic polysomnography data for a smaller group of patients. Here, they found decreased sleep efficiency (P = .0098), increased spontaneous arousal index (P = .034), and prolonged sleep onset latency (P = .0000052) were all significantly associated with high SUDEP risk.
From the polysomnographic data, researchers found high SUDEP risk was significantly associated with a diagnosis of OSA (P = .034).
Powerful Study
Commenting on the findings, Gordon F. Buchanan, MD, PhD, Beth L. Tross epilepsy associate professor, Department of Neurology, University of Iowa Carver College of Medicine, Iowa City, said he was “very excited” by the research.
“That this study attempts to look through data in a retrospective way and see if there’s additional risk with having comorbid sleep disorders is really interesting and I think really powerful,” he said.
Sleep disorders “are potentially a really simple thing that we can screen for and test for,” he added. He also noted that additional research is needed to replicate the findings.
Dr. Buchanan acknowledged that the SUDEP-7 inventory is not a particularly good tool and said there is a need for a better means of assessment that includes sleep disorders and other factors like sleep states and circadian rhythm, which he said affect SUDEP risk.
Ms. Lazaj and Dr. Buchanan report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM AES 2023
Experimental Therapy Restores Cognitive Function in Chronic TBI
Deep brain stimulation (DBS) of the central thalamus may restore cognitive function in patients with moderate to severe traumatic brain injury (msTBI) and chronic sequelae, new research suggests.
Participants in this first-in-humans trial experienced brain injuries between 3 and 18 years before the study that left them with persistent neuropsychological impairment and a range of functional disabilities.
This is the first time a DBS device has been implanted in the central thalamus in humans, an area of the brain measuring only a few millimeters wide that helps regulate consciousness.
Placing the electrodes required a novel surgical technique developed by the investigators that included virtual models of each participant’s brain, microelectrode recording, and neuroimaging to identify neuronal circuits affected by the TBI.
After 3 months of 12-hour daily DBS treatment, participants’ performance on cognitive tests improved by an average of 32% from baseline. Participants were able to read books, watch TV shows, play video games, and complete schoolwork, and they felt significantly less fatigued during the day.
Although the small trial included only five patients, the work is already being hailed by other experts as significant.
“We were looking for partial restoration of executive attention and expected [the treatment] would have an effect, but I wouldn’t have anticipated the effect size we saw,” co-lead investigator Nicholas Schiff, MD, professor of neuroscience at Weill Cornell Medical College, New York City, said in an interview.
The findings were published online Dec. 4 in Nature Medicine.
“No Trivial Feat”
An estimated 5.3 million children and adults are living with a permanent TBI-related disability in the US today. There is currently no effective therapy for impaired attention, executive function, working memory, or information-processing speed caused by the initial injury.
Previous research suggests that a loss of activity in key brain circuits in the thalamus may be associated with a loss of cognitive function.
The investigators recruited six adults (four men and two women) between the ages of 22 and 60 years with a history of msTBI and chronic neuropsychological impairment and functional disability. One participant was later withdrawn from the trial for protocol noncompliance.
Participants completed a range of questionnaires and tests to establish baseline cognitive, psychological, and quality-of-life status.
To restore lost executive functioning in the brain, investigators had to target not only the central lateral nucleus, but also the neuronal network connected to the region that reaches other parts of the brain.
“To do both of those things we had to develop a whole toolset in order to model both the target and trajectory, which had to be right to make it work properly,” co-lead investigator Jaimie Henderson, MD, professor of neurosurgery at Stanford University School of Medicine, Stanford, California, said in an interview. “That gave us a pretty narrow window in which to work and getting an electrode accurately to this target is not a trivial feat.”
“A Moving Target”
Each participant’s brain physiology was slightly different, meaning the path that worked for one individual might not work for another. The surgery was further complicated by shifting in the brain that occurred as individual electrodes were placed.
“It was a literal moving target,” Dr. Henderson said.
In the beginning, investigators used microelectrode recording to “listen” to individual neurons to see which ones weren’t firing correctly.
When that method failed to offer the precise information needed for electrode placement, the investigators switched to neuroimaging, which allowed them to complete the surgery more quickly and accurately.
Participants remained in the hospital for 1-2 days after surgery. They returned for postoperative imaging 30 days after surgery and were randomly assigned to different schedules for a 14-day titration period to optimize DBS settings.
The primary outcome was a 10% improvement on part B of the trail-making test, a neuropsychological test that measures executive functioning.
After 90 days of 12-hour daily DBS treatment, participants’ scores increased 15%–52% (average, 32%) from baseline. Participants also reported an average 33% decline in fatigue, one of the most common sequelae of msTBI, and an average 80% improvement in attention.
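To make the outcome arithmetic concrete, the sketch below computes percent change from baseline and checks it against the prespecified 10% primary-outcome threshold. This is a hypothetical illustration, not the trial’s analysis code; it assumes a score on which higher values indicate better performance, and the function names are mine:

```python
def pct_change(baseline: float, followup: float) -> float:
    """Percent change from baseline; positive values indicate improvement."""
    return (followup - baseline) / baseline * 100.0

def meets_primary_outcome(baseline: float, followup: float,
                          threshold: float = 10.0) -> bool:
    """True if the improvement reaches the prespecified 10% threshold."""
    return pct_change(baseline, followup) >= threshold

# A 50 -> 66 score change is a 32% improvement, the trial's reported average.
print(meets_primary_outcome(50.0, 66.0))  # prints True
```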
The main safety risk during the 3- to 4-hour procedure is bleeding, which did not occur in any of the participants in this study. One participant developed a surgical site infection, but all other side effects were mild.
After the 90-day treatment period, the study plan called for participants to be randomly assigned to a blinded withdrawal of treatment, with the DBS device turned off for 21 days. Two of the patients declined to be randomized; of the remaining three, DBS was turned off in one participant while the other two continued treatment as normal.
After 3 weeks, the patient whose DBS was turned off showed a 34% decline on cognitive tests. The device was reactivated after the study and that participant has since reported improvements.
The DBS devices continue to function in all participants. Although their performance is not being measured as part of the study, anecdotal reports indicate sustained improvement in executive functioning.
“The brain injury causes this global down-regulation of brain function and what we think that this is doing is turning that back up again,” Dr. Henderson said. “At a very simplistic level, what we’re trying to do is turn the lights back up after the dimmer switch is switched down from the injury.”
New Hope
TBI patients are usually treated aggressively during the first year, when significant improvements are most likely, but there are few therapeutic options beyond that time, said neurologist Javier Cardenas, MD, who commented on the findings for this article.
“Many providers throw their hands up after a year in terms of intervention and then we’re always looking at potential declines over time,” said Dr. Cardenas, director of the Concussion and Brain Injury Center at the Rockefeller Neuroscience Institute, West Virginia University, Morgantown. “Most people plateau and don’t decline but we’re always worried about a secondary decline in traumatic brain injury.”
Surgery is usually employed only immediately following the brain injury. The notion of surgery as a therapeutic option years after the initial assault on the brain is novel, said Jimmy Yang, MD, assistant professor of neurologic surgery at Ohio State University College of Medicine, Columbus, who commented on the findings for this article.
“While deep brain stimulation surgery in clinical practice is specifically tailored to each patient we treat, this study goes a step further by integrating research tools that have not yet made it to the clinical realm,” Dr. Yang said. “As a result, while these methods are not commonly used in clinical care, the overall strategy highlights how research advances are linked to clinical advances.”
Investigators are working to secure funding for a larger phase 2 trial.
“With millions of people affected by traumatic brain injury but without effective therapies, this study brings hope that options are on the horizon to help these patients,” Dr. Yang said.
The study was supported by funding from the National Institute of Health BRAIN Initiative and a grant from the Translational Science Center at Weill Cornell Medical College. Surgical implants were provided by Medtronic. Dr. Henderson and Dr. Schiff are listed as inventors on several patent applications for the experimental DBS therapy described in the study. Dr. Cardenas and Dr. Yang report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
(msTBI) and chronic sequelae.
Participants in this first-in-humans trial experienced brain injuries between 3-18 years before the study that left them with persistent neuropsychological impairment and a range of functional disabilities.
This is the first time a DBS device has been implanted in the central thalamus in humans, an area of the brain measuring only a few millimeters wide that helps regulate consciousness.
Placing the electrodes required a novel surgical technique developed by the investigators that included virtual models of each participant’s brain, microelectrode recording, and neuroimaging to identify neuronal circuits affected by the TBI.
After 3 months of 12-hour daily DBS treatments, participants’ performance on cognitive tests improved by an average of 32% from baseline. Participants were able to read books, watch TV shows, play video games, complete schoolwork, and felt significantly less fatigued during the day.
Although the small trial only included five patients, the work is already being hailed by other experts as significant.“We were looking for partial restoration of executive attention and expected [the treatment] would have an effect, but I wouldn’t have anticipated the effect size we saw,” co-lead investigator Nicholas Schiff, MD, professor of neuroscience at Weill Cornell Medical College, New York City, said in an interview.
The findings were published online Dec. 4 in Nature Medicine.
“No Trivial Feat”
An estimated 5.3 million children and adults are living with a permanent TBI-related disability in the US today. There currently is no effective therapy for impaired attention, executive function, working memory or information-processing speed caused by the initial injury.
Previous research suggests that a loss of activity in key brain circuits in the thalamus may be associated with a loss of cognitive function.
The investigators recruited six adults (four men and two women) between the ages of 22 and 60 years with a history of msTBI and chronic neuropsychological impairment and functional disability. One participant was later withdrawn from the trial for protocol noncompliance.
Participants completed a range of questionnaires and tests to establish baseline cognitive, psychological, and quality-of-life status.
To restore lost executive functioning in the brain, investigators had to target not only the central lateral nucleus, but also the neuronal network connected to the region that reaches other parts of the brain.
“To do both of those things we had to develop a whole toolset in order to model both the target and trajectory, which had to be right to make it work properly,” co-lead investigator Jaimie Henderson, MD, professor of neurosurgery at Stanford University College of Medicine, Stanford, California, said in an interview. “That gave us a pretty narrow window in which to work and getting an electrode accurately to this target is not a trivial feat.”
“A Moving Target”
Each participant’s brain physiology was slightly different, meaning the path that worked for one individual might not work for another. The surgery was further complicated by shifting in the brain that occurred as individual electrodes were placed.
“It was a literal moving target,” Dr. Henderson said.
In the beginning, investigators used microelectrode recording to “listen” to individual neurons to see which ones weren’t firing correctly.
When that method failed to offer the precise information needed for electrode placement, the investigators switched to neuroimaging, which allowed them to complete the surgery more quickly and accurately.
Participants remained in the hospital 1-2 days after surgery. They returned for postoperative imaging 30 days after surgery and were randomly assigned to different schedules for a 14-day titration period to optimize stimulation settings.
The primary outcome was a 10% improvement on part B of the trail-making test, a neuropsychological test that measures executive functioning.
After 90 days of 12-hour daily DBS treatments, participants’ scores increased 15%–52% (average 32%) from baseline. Participants also reported an average of 33% decline in fatigue, one of the most common side effects of msTBI, and an average 80% improvement in attention.
The main safety risk during the 3- to 4-hour procedure is bleeding, which didn’t affect any of the participants in this study. One participant developed a surgical site infection, but all other side effects were mild.
After the 90-day treatment period, the study plan called for participants to be randomly assigned to a blinded withdrawal of treatment, with DBS turned off for 21 days. Two of the five declined to be randomized. Of the three who were randomized, DBS was turned off in one participant, while the other two continued treatment as normal.
After 3 weeks, the patient whose DBS was turned off showed a 34% decline on cognitive tests. The device was reactivated after the study and that participant has since reported improvements.
The DBS devices continue to function in all participants. Although their performance is not being measured as part of the study, anecdotal reports indicate sustained improvement in executive functioning.
“The brain injury causes this global down-regulation of brain function and what we think that this is doing is turning that back up again,” Dr. Henderson said. “At a very simplistic level, what we’re trying to do is turn the lights back up after the dimmer switch is switched down from the injury.”
New Hope
TBI patients are usually treated aggressively during the first year, when significant improvements are most likely, but there are few therapeutic options beyond that time, said neurologist Javier Cardenas, MD, who commented on the findings for this article.
“Many providers throw their hands up after a year in terms of intervention and then we’re always looking at potential declines over time,” said Dr. Cardenas, director of the Concussion and Brain Injury Center at the Rockefeller Neuroscience Institute, West Virginia University, Morgantown. “Most people plateau and don’t decline but we’re always worried about a secondary decline in traumatic brain injury.”

Surgery is usually employed only immediately following the brain injury. The notion of surgery as a therapeutic option years after the initial assault on the brain is novel, said Jimmy Yang, MD, assistant professor of neurologic surgery at Ohio State University College of Medicine, Columbus, who commented on the findings for this article.
“While deep brain stimulation surgery in clinical practice is specifically tailored to each patient we treat, this study goes a step further by integrating research tools that have not yet made it to the clinical realm,” Dr. Yang said. “As a result, while these methods are not commonly used in clinical care, the overall strategy highlights how research advances are linked to clinical advances.”
Investigators are working to secure funding for a larger phase 2 trial.
“With millions of people affected by traumatic brain injury but without effective therapies, this study brings hope that options are on the horizon to help these patients,” Dr. Yang said.
The study was supported by funding from the National Institutes of Health BRAIN Initiative and a grant from the Translational Science Center at Weill Cornell Medical College. Surgical implants were provided by Medtronic. Dr. Henderson and Dr. Schiff are listed as inventors on several patent applications for the experimental DBS therapy described in the study. Dr. Cardenas and Dr. Yang report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Camp Lejeune Family Members Now Eligible for Health Care Reimbursement Related to Parkinson Disease
Family members of veterans exposed to contaminated drinking water at Marine Corps Base Camp Lejeune, Jacksonville, North Carolina, from August 1, 1953, to December 31, 1987, are now eligible for reimbursement of health care costs associated with Parkinson disease (PD) under the Camp Lejeune Family Member Program, the US Department of Veterans Affairs (VA) has announced.
That brings the number of illnesses or conditions those family members can be reimbursed for to 16: esophageal, lung, breast, bladder, and kidney cancer, leukemia, multiple myeloma, renal toxicity, miscarriage, hepatic steatosis, female infertility, myelodysplastic syndromes, scleroderma, neurobehavioral effects, non-Hodgkin lymphoma, and Parkinson disease.
A recent JAMA study of 340,489 service members found that the risk of PD is 70% higher for veterans stationed at Camp Lejeune (n = 279) compared with veterans stationed at Camp Pendleton, California (n = 151).
The researchers say water supplies at Camp Lejeune were contaminated with several volatile organic compounds. They suggest that the risk of PD may be related to exposure to trichloroethylene (TCE), a volatile organic compound widely used as a cleaning agent, in the manufacture of some refrigerants, and in paints and other products. In January, the US Environmental Protection Agency issued a revised risk determination saying that TCE presents an unreasonable risk to the health of workers, occupational nonusers (workers nearby but not in direct contact with this chemical), consumers, and bystanders.
Levels at Camp Lejeune were highest for TCE, with monthly median values greater than 70-fold the permissible amount.
Camp Lejeune veterans also had a significantly increased risk of prodromal PD diagnoses, including tremor, anxiety, and erectile dysfunction, and higher cumulative prodromal risk scores. No excess risk was found for other forms of neurodegenerative parkinsonism.
The PACT Act allows veterans and their families to file lawsuits for harm caused by exposure to contaminated water at Camp Lejeune. “Veterans and their families deserve no-cost health care for the conditions they developed due to the contaminated water at Camp Lejeune,” said VA Under Secretary for Health Shereef Elnahal, MD. “We’re proud to add Parkinson disease to the list of conditions that are covered for veteran family members, and we implore anyone who may be living with this disease—or any of the other conditions covered by VA’s Camp Lejeune Family Member Program—to apply for assistance today.”
Secondhand smoke exposure linked to migraine, severe headache
TOPLINE:
Secondhand smoke (SHS) exposure is linked to an increased risk for severe headache or migraine, with effects of exposure varying depending on body mass index (BMI) and level of physical activity, new research shows.
METHODOLOGY:
Investigators analyzed data on 4,560 participants (median age, 43 years; 60% female; 71.5% White) from the 1999-2004 National Health and Nutrition Examination Survey.
Participants were aged 20 years or older and had never smoked.
Migraine headache status was determined by asking whether participants experienced severe headaches or migraines during the previous 3 months.
SHS exposure was categorized as unexposed (serum cotinine levels <0.05 ng/mL and no smoker in the home), low (0.05 ng/mL ≤ serum cotinine level <1 ng/mL), or heavy (1 ng/mL ≤ serum cotinine level ≤ 10 ng/mL).
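The exposure categories above follow simple serum cotinine cutoffs. As an illustration only (this is a sketch of the stated thresholds, not the study's code), the classification can be written as:

```python
def classify_shs_exposure(cotinine_ng_ml: float, smoker_in_home: bool = False) -> str:
    """Classify secondhand-smoke exposure from serum cotinine (ng/mL),
    per the cutoffs described above. Illustrative only."""
    if cotinine_ng_ml < 0.05 and not smoker_in_home:
        return "unexposed"
    if 0.05 <= cotinine_ng_ml < 1:
        return "low"
    if 1 <= cotinine_ng_ml <= 10:
        return "heavy"
    # Levels above 10 ng/mL typically indicate active smoking and fall
    # outside the study's SHS categories.
    return "unclassified"
```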
TAKEAWAY:
In all, 919 (20%) participants had severe headaches or migraines.
After adjustment for demographic and lifestyle factors (including medication use), heavy SHS exposure was positively associated with severe headache or migraine (adjusted odds ratio [aOR], 2.02; 95% CI, 1.19-3.43).
No significant association was found between low SHS exposure and severe headaches or migraine (aOR, 1.15; 95% CI, 0.91-1.47).
In participants who were sedentary (P=.016) and those with a BMI <25 (P=.001), significant associations between SHS and severe headache or migraine were observed.
IN PRACTICE:
Noting a linear dose-response relationship between cotinine and severe headaches or migraine, the investigators write, “These findings underscore the need for stronger regulation of tobacco exposure, particularly in homes and public places.”
SOURCE:
Junpeng Wu, MMc, and Haitang Wang, MD, of Southern Medical University in Guangzhou, China, and their colleagues conducted the study. It was published online in Headache.
LIMITATIONS:
The study could not establish causal relationships between SHS and migraine or severe headache. In addition, the half-life of serum cotinine is 15-40 hours and thus this measure can reflect only recent SHS exposure.
DISCLOSURES:
The study was not funded. The investigators reported no disclosures.
A version of this article appeared on Medscape.com.
Food insecurity a dementia risk factor?
TOPLINE:
Food insecurity among older adults is associated with increased dementia risk, poorer memory function, and faster memory decline, new research indicates.
METHODOLOGY:
- Researchers analyzed data on 7,012 adults (mean age, 67 years; 59% women) from the U.S. Health and Retirement Study.
- Food security status was assessed in 2013 using a validated survey, with cognitive outcomes evaluated between 2014 and 2018.
- Analyses were adjusted for demographics, socioeconomics, and health factors.
TAKEAWAY:
- About 18% of adults were food insecure, with 10% reporting low food security and 8% very low food security. About 11% of those aged 65+ in 2013 were food insecure.
- The odds of dementia were 38% higher (odds ratio, 1.38; 95% confidence interval [CI], 1.15-1.67) in adults with low food security and 37% higher (OR, 1.37; 95% CI, 1.11-1.59) in those with very low food security, compared with food-secure adults.
- Food insecurity was associated with an increase in dementia risk equivalent to roughly 1.3 excess years of cognitive aging.
- Low and very low food security were also associated with lower memory levels and faster age-related memory decline.
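The “X% higher odds” phrasing in these bullets is simply the odds ratio re-expressed as a percentage, (OR − 1) × 100. A minimal sketch (illustrative, not the study's code):

```python
def pct_higher_odds(odds_ratio: float) -> int:
    """Re-express an odds ratio as percent-higher odds vs. the reference group."""
    return round((odds_ratio - 1) * 100)

# ORs reported for low and very low food security vs. food-secure adults
print(pct_higher_odds(1.38))  # 38 -> "38% higher odds"
print(pct_higher_odds(1.37))  # 37 -> "37% higher odds"
```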
IN PRACTICE:
“Our study contributes to a limited literature by capitalizing on a large and diverse sample, validated exposure and outcome measures, and longitudinal data to robustly evaluate these associations, providing evidence in support of the connection between food insecurity in older adulthood and subsequent brain health,” the authors wrote. “Our findings highlight the need to improve food security in older adults and that doing so may protect individuals from cognitive decline and dementia.”
SOURCE:
The study, with first author Haobing Qian, PhD, of the University of California, San Francisco, was published online in JAMA Network Open.
LIMITATIONS:
Residual confounding cannot be ruled out. Food insecurity was not assessed prior to 2013. The researchers lacked information on clinical dementia diagnoses.
DISCLOSURES:
The study was supported by grants from the National Institutes of Health. The authors reported no relevant conflicts of interest.
A version of this article first appeared on Medscape.com.
Exercise improves physical and cognitive health in Down syndrome
In the first study of its kind, U.K. and French researchers reported that exercise positively affected physical and cognitive health in persons with Down syndrome. “The findings are significant and offer a crucial challenge to the [Down syndrome] and wider societies,” wrote a team led by Dan Gordon, PhD, associate professor of cardiorespiratory exercise physiology at Anglia Ruskin University in Cambridge, England. “Impact of Prescribed Exercise on the Physical and Cognitive Health of Adults with Down Syndrome: The MinDSets Study” was published in the International Journal of Environmental Research and Public Health.
“Through the simple application of walking, a form of exercise which requires little to no equipment or expense, there were significant increases in cognitive and executive function, reflecting improved capabilities in key attributes of information processing, vigilance, and selective attention,” the researchers wrote.
“Increased cognitive function will help foster increased societal integration and quality of life, which, given that this is the first generation of those with [Down syndrome] to outlive their parents and caregivers, is of importance,” they wrote.
For example, those in an exercise-only intervention arm had an 11.4% improvement in the distance covered on the Six-Minute Walk Test, going from a mean of 498.8 meters before the intervention to 522.1 meters afterward. Those in a group that combined exercise with cognitive training increased the distance walked by 9.9%, or 49.2 meters. Groups that received cognitive training only or no intervention showed no significant changes.
In measures of cognitive function, the exercise group showed a 38% increase in selective attention, with the cognitive and combined groups showing changes of 16.5% and 55.3%, respectively, on the same measure. The change in concentration in the exercise-alone group was 31.5%, while those receiving cognitive training alone or combined exercise plus cognitive training showed improvements in concentration of 21% and 15%, respectively.
Asked why a combination intervention was not superior to exercise alone, Dr. Gordon said in an interview, “Something we’re looking at in the data but can’t fully confirm is that the combined group started to become fatigued due to the double dose of the intervention, and this prevented them in the final tests from doing quite so well as the exercise-alone group. Irrespective of the magnitude of change, any cognitive adaptation observed will be beneficial to this population.”
The evidence for the benefits of exercise on both physical and cognitive health in a non–Down syndrome population is well established, he said, but there were few data on its effect on the Down syndrome population.
One small study showed physical and neurocognitive benefits with resistance training.
“The evidence from previous studies showed increased levels of inactivity and sitting time in Down syndrome individuals compared with non–[Down syndrome] controls, so we hypothesized that exercise, albeit small amounts, would increase their physical fitness,” Dr. Gordon said.
His team also hypothesized that walking would stimulate cognitive development since it requires heightened cognitive engagement compared with inactivity. “What surprised us was the degree of improvement,” Dr. Gordon said.
The process of walking requires the brain to interpret information on a real-time basis from both internal and external cues, he continued. “For most of us this process requires low-level cognitive engagement. However, in the [Down syndrome] population, where motor control is impaired and accompanied by poor muscle tone, walking imposes a heightened cognitive load.” It requires them to concentrate on the action, be aware of their surroundings, and make the right decisions, all of which stimulate areas of the brain that control these functions.
Study details
Eighty-three adult participants were available for final analysis – 67 from North America, 8 from Europe, 5 from Africa, 2 from Asia, and 1 from Australia. The mean age of participants was 27.1 years, 40 were female, and all had caregiver support during the study.
Those unable to visualize information on computer and mobile/tablet screens or to listen to instructions/auditory cues were excluded. All were provided with instructions and a mobile monitoring tool set to record steps completed, distances covered, speeds, and heart rate.
Each was assigned to one of four groups. Exercise intervention-only consisted of 8 weeks of cardiorespiratory exercise defined as either walking or jogging three times a week for 30 minutes. Cognitive training included eight levels (about 20 minutes) of cognitive and executive function exercises six times per week. The combined group completed both the cardiorespiratory and cognitive interventions, while the fourth group acted as controls with no intervention.
According to the authors, the study offers a real-life scenario that can be readily adopted within the Down syndrome community.
This study was commissioned by the Canadian Down Syndrome Society. The authors had no conflicts of interest to declare.
FROM INTERNATIONAL JOURNAL OF ENVIRONMENTAL RESEARCH AND PUBLIC HEALTH
FDA warns of potentially lethal reaction to seizure meds
The reaction, known as drug reaction with eosinophilia and systemic symptoms (DRESS), may start as a rash but can quickly progress, causing injury to internal organs, the need for hospitalization, and death, the FDA notes.
A search of the FDA Adverse Event Reporting System (FAERS) and the medical literature through March 2023 identified 32 serious cases of DRESS worldwide that were associated with levetiracetam.
Three cases occurred in the United States, and 29 occurred abroad. In all 32 cases, the patients were hospitalized and received medical treatment; in 2 cases, the patients died.
The median time to onset of DRESS in the levetiracetam cases was 24 days; times ranged from 7 to 170 days. The reported signs and symptoms included skin rash (n = 22), fever (n = 20), eosinophilia (n = 17), lymph node swelling (n = 9), and atypical lymphocytes (n = 4).
Twenty-two levetiracetam-associated cases of DRESS involved injury to one or more organs, including the liver, lungs, kidneys, and gallbladder.
In 25 of the 29 cases for which information on treatment discontinuation was available, DRESS symptoms resolved when levetiracetam was discontinued.
As for clobazam, a search of FAERS and the medical literature through July 2023 identified 10 serious cases of DRESS worldwide – 1 in the United States and 9 abroad. All 10 patients were hospitalized and received medical treatment. No deaths were reported.
The median time to onset of clobazam-associated DRESS was 21.5 days (range, 7-103 days). The reported signs and symptoms included skin rash (n = 10), fever (n = 8), eosinophilia (n = 7), facial swelling (n = 7), leukocytosis (n = 4), lymph node swelling (n = 4), and leukopenia/thrombocytopenia (n = 1).
In nine cases, there was injury to one or more organs, including the liver, kidneys, and gastrointestinal tract.
DRESS symptoms resolved in all 10 cases when treatment with clobazam was stopped. DRESS and other serious skin reactions reported with clobazam, a benzodiazepine, have not generally been associated with other benzodiazepines, the FDA notes.
Label updates
As a result of these cases, warnings about the risk of DRESS will be added to the prescribing information and patient medication guides for these medicines, the FDA announced.
“Health care professionals should be aware that prompt recognition and early treatment is important for improving DRESS outcomes and decreasing mortality,” the FDA said.
They noted that diagnosis is often difficult because early signs and symptoms, such as fever and swollen lymph nodes, may be present without evidence of a rash.
DRESS may develop 2-8 weeks after starting levetiracetam or clobazam. Symptoms and intensity can vary widely.
DRESS can also be confused with other serious skin reactions, such as Stevens-Johnson syndrome and toxic epidermal necrolysis.
The FDA says patients should be advised of the signs and symptoms of DRESS and be told to stop taking the medicine and seek immediate medical attention if DRESS is suspected during treatment with levetiracetam or clobazam.
Adverse reactions with these medications should be reported to the FDA’s MedWatch program.
A version of this article appeared on Medscape.com.