Neighborhood Disadvantage Tied to Higher Risk for ASD
TOPLINE
Children born in disadvantaged neighborhoods are at higher risk of being diagnosed with autism spectrum disorder (ASD), a population-based prospective cohort study shows.
METHODOLOGY
- Investigators analyzed data from a large cohort of insured singleton children born in Kaiser Permanente Southern California hospitals between 2001 and 2014.
- They ascertained ASD diagnosis, maternal race and ethnicity, and maternal address at time of birth.
- Neighborhood disadvantage was determined by the percentage of families in the mother’s neighborhood who were living in poverty, were unemployed, had female-headed households with children, received public assistance, or had less than a high school education, among other variables.
TAKEAWAY
- Among 318,300 mothers who delivered babies during the study period, 6350 children were diagnosed with ASD during follow-up, and median age at diagnosis was 3.5 years.
- Greater neighborhood disadvantage at birth was associated with a higher likelihood of ASD diagnosis (adjusted hazard ratio [aHR], 1.07; 95% CI, 1.02-1.11).
- ASD diagnoses were more likely among children of mothers who were Black (aHR, 1.13; 95% CI, 1.02-1.25), Asian/Pacific Islander (aHR, 1.11; 95% CI, 1.02-1.20), or Hispanic (aHR, 1.07; 95% CI, 1.00-1.15), even after the researchers controlled for neighborhood.
- While the likelihood of an ASD diagnosis was higher among children from minority racial and ethnic groups, neighborhood disadvantage was significantly associated with ASD diagnosis only for children of White mothers (aHR, 1.17; 95% CI, 1.09-1.26).
IN PRACTICE
Investigators noted that they could only speculate about the factors driving the stronger association between neighborhood disadvantage and risk for ASD diagnosis among children of White mothers. “They may be due to systemic racism, discrimination, and their impact on maternal health during pregnancy,” they wrote.
SOURCE
Xin Yu, MS, and Daniel Hackman, PhD, of the University of Southern California in Los Angeles, led the study, which was published online November 15 in JAMA Psychiatry.
LIMITATIONS
The research was limited by a lack of information on fathers and on variables such as income, which may have confounded the findings. The authors also acknowledged that the study should be replicated in other health service settings.
DISCLOSURES
The study was funded by the National Institute of Environmental Health Sciences, the National Institutes of Health (NIH), and the Environmental Protection Agency. Dr. Hackman reported receiving grant funding from NIH during the conduct of the study. Other disclosures are available in the original study.
A version of this article appeared on Medscape.com.
Exercise plan cost-effective in post-stroke cognitive rehab
A multicomponent exercise program that includes strength, aerobic, agility, and balance training exercises is cost-effective and results in improved cognition among stroke survivors, compared with a balance and tone control group, according to a new analysis.
On the other hand, a program consisting of cognitive and social enrichment activities that includes memory, brain training, and group social games entailed higher costs, compared with the balance and tone group, which included stretches, deep breathing and relaxation techniques, posture education, and core control exercises.
“Cognitive impairment is experienced in approximately one-third of stroke survivors,” study author Jennifer Davis, PhD, a Canada research chair in applied health economics and assistant professor of management at the University of British Columbia in Kelowna, said in an interview.
“The economic evaluation of the exercise intervention demonstrated that the multicomponent exercise program provided good value for the money when comparing costs and cognitive outcomes,” she said. However, “impacts on health-related quality of life were not observed.”
The study was published online November 30 in JAMA Network Open.
Comparing Three Approaches
Despite improved care, patients with stroke often face challenges with physical function, cognitive abilities, and quality of life, the authors wrote. Among older adults, in particular, cognitive deficits remain prevalent and are associated with increased risks for dementia, mortality, and increased burdens for patients, caregivers, and health systems.
Numerous interventions have shown promise for post-stroke cognitive rehabilitation, including exercise and cognitive training, the authors wrote. Research hasn’t indicated which programs offer the most efficient or cost-effective options, however.
Dr. Davis and colleagues conducted an economic evaluation alongside the Vitality study, a three-group randomized clinical trial that examined the efficacy of improving cognitive function among patients with chronic stroke through a multicomponent exercise program, cognitive and social enrichment activities, or a control group with balance and tone activities.
The economic evaluation included a cost-effectiveness analysis (based on incremental cost per change in cognitive function) and a cost-utility analysis (incremental cost per quality-adjusted life-year [QALY] gained). The researchers used a cost-effectiveness threshold of CAD $50,000 (Canadian dollars) per QALY for the cost-utility analysis, based on Canadian precedent.
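An ICER, as used in the analyses above, is simply the difference in cost divided by the difference in effect, judged against the stated CAD $50,000-per-QALY threshold. A minimal sketch of that arithmetic follows; the per-person costs echo the delivery costs reported later, but the QALY figures are hypothetical, not the trial's actual estimates.

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_old) / (effect_new - effect_old)

THRESHOLD = 50_000  # CAD per QALY, the threshold cited in the study

# Hypothetical illustration: an intervention costing CAD $1,090 per person
# vs a control costing CAD $777, yielding 0.02 vs 0.01 QALYs.
ratio = icer(1090, 777, 0.02, 0.01)
print(round(ratio))        # 31300 CAD per QALY gained
print(ratio <= THRESHOLD)  # True: cost-effective at this threshold
```

Note that a negative ICER by itself is ambiguous: it arises both when an intervention is cheaper and more effective (cost saving) and when it is costlier and less effective, which is why the direction of the cost and effect differences is reported alongside the ratio.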
The clinical trial included 120 community-dwelling adults aged 55 years and older who had a stroke at least 12 months before the study. Based in the Vancouver metropolitan area, participants were randomly assigned to twice-weekly, 60-minute classes led by trained instructors for 26 weeks. The mean age was 71 years, and 62% of participants were men.
Exercise Effective
Overall, the balance and tone control group had the lowest delivery cost at CAD $777 per person, followed by CAD $1090 per person for the exercise group and CAD $1492 per person for the cognitive and social enrichment group.
After the 6-month intervention, the mean cognitive scores were –0.192 for the exercise group, –0.184 for the cognitive and social enrichment group, and –0.171 for the balance and tone group, indicating better cognitive function across all three groups.
In the cost-effectiveness analysis, the exercise intervention was costlier but more effective than the control group, with an incremental cost-effectiveness ratio (ICER) of CAD –$8823.
In the cost-utility analysis, the exercise intervention was cost saving (less costly and more effective), compared with the control group, with an ICER of CAD –$3381 per QALY gained at the end of the intervention and an ICER of CAD –$154,198 per QALY gained at the end of the 12-month follow-up period. The cognitive and social enrichment program was more costly and more effective than the control group, with an ICER of CAD $101,687 per QALY gained at the end of the intervention and an ICER of CAD $331,306 per QALY gained at the end of the follow-up period.
In additional analyses, the exercise group had the lowest healthcare resource utilization, driven by lower costs for physician visits and laboratory tests.
“This study provides initial data that suggests multicomponent exercise may be a cost-effective solution for combating cognitive decline among stroke survivors,” said Dr. Davis.
Overall, exercise was cost-effective for improving cognitive function but not quality of life among participants. Because the clinical trial was powered to detect changes in cognitive function rather than quality of life, it lacked the statistical power to detect quality-of-life differences, said Dr. Davis.
Exercise programs and cognitive and social enrichment programs show promise for improving cognitive function after stroke, the authors wrote, though future research should focus on optimizing cost-effectiveness and enhancing health-related quality of life.
Considering Additional Benefits
Commenting on the study, Alan Tam, MD, a physiatrist at the Toronto Rehabilitation Institute’s Brain Rehabilitation Program, said, “The authors show that within the timeframe of their analysis, there is a trend to cost-effectiveness for the cognitive intervention being offered.” Dr. Tam did not participate in the research.
“However, the finding is not robust, as less than 50% of their simulations would meet their acceptability level they have defined,” he said. “Given that most of the cost of the intervention is up front, but the benefits are likely lifelong, potentially taking the 12-month analysis to a lifetime analysis would show more significant findings.”
Dr. Tam researches factors associated with brain injury rehabilitation and has explored the cost-effectiveness of a high-intensity outpatient stroke rehabilitation program.
“Presenting this type of work is important,” he said. “While there are interventions that do not meet our definition of statistical significance, especially in the rehabilitation world, there can still be a benefit for patients and health systems.”
The primary study was funded by the Canadian Institutes of Health Research (CIHR) and the Jack Brown and Family Alzheimer Research Foundation Society. Dr. Davis reported receiving grants from the CIHR and Michael Smith Health Research BC during the conduct of the study. Dr. Tam reported no relevant financial relationships.
A version of this article appeared on Medscape.com.
More evidence that modified Atkins diet lowers seizures in adults
ORLANDO — The modified Atkins diet (MAD) may reduce seizure frequency in adults with drug-resistant epilepsy, new research suggests.
The results of the small new review and meta-analysis suggest that “the MAD may be an effective adjuvant therapy for older patients who have failed anti-seizure medications,” study investigator Aiswarya Raj, MBBS, Aster Malabar Institute of Medical Sciences, Kerala, India, said in an interview.
The findings were presented at the annual meeting of the American Epilepsy Society.
Paucity of Adult Data
The MAD is a less restrictive hybrid of the ketogenic diet that limits carbohydrate intake and encourages fat consumption. It does not restrict fluids, calories, or proteins and does not require fats to be weighed or measured.
The diet includes fewer carbohydrates than the traditional Atkins diet and places more emphasis on fat intake. Dr. Raj said that the research suggests that the MAD “is a promising therapy in pediatric populations, but there’s not a lot of data in adults.”
Dr. Raj noted that this diet type has not been that popular in patients who clinicians believe might be better treated with drug therapy, possibly because of concern about the cardiac impact of consuming high-fat foods.
After conducting a systematic literature review assessing the efficacy of MAD in adults, the researchers included three randomized controlled trials and four observational studies published from January 2000 to May 2023 in the analysis.
The randomized controlled trials in the review assessed the primary outcome, a greater than 50% seizure reduction, at the end of 2 months, 3 months, and 6 months. In the MAD group, 32.5% of participants had more than a 50% seizure reduction vs 3% in the control group (odds ratio [OR], 12.62; 95% CI, 4.05-39.29; P < .0001).
Four participants who followed the diet achieved complete seizure-freedom compared with no participants in the control group (OR, 16.20; 95% CI, 0.82-318.82; P = .07).
The prospective studies examined this outcome at the end of 1 month or 3 months. In these studies, 41.9% of individuals experienced more than a 50% seizure reduction after 1 month of following the MAD, and 34.2% experienced this reduction after 3 months (OR, 1.41; 95% CI, 0.79-2.52; P = .24), with zero heterogeneity across studies.
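The pooled odds ratios reported above are derived from 2×2 tables of responders and non-responders in each arm. As a rough sketch of how an odds ratio and its 95% CI are computed from event counts (the counts below are hypothetical and do not reproduce the meta-analysis data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a = responders on diet, b = non-responders on diet,
    c = responders on control, d = non-responders on control."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) via the Woolf method
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 26 of 80 responders on the MAD vs 2 of 67 on control
or_, lo, hi = odds_ratio_ci(26, 54, 2, 65)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The very wide interval produced when control events are rare mirrors the wide CI seen in the seizure-freedom result above (95% CI, 0.82-318.82), where only four events occurred, all in the diet group.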
It’s difficult to interpret the difference in seizure reduction between 1 and 3 months of therapy, Dr. Raj noted, because “there’s always the issue of compliance when you put a patient on a long-term diet.”
Positive results for MAD in adults were shown in another recent systematic review and meta-analysis published in Seizure: European Journal of Epilepsy.
That analysis included six studies with 575 patients who were randomly assigned to MAD or usual diet (UD) plus standard drug therapy. After an average follow-up of 12 weeks, MAD was associated with a higher rate of 50% or greater reduction in seizure frequency (relative risk [RR], 6.28; 95% CI, 3.52-10.50; P < .001), both in adults with drug-resistant epilepsy (RR, 6.14; 95% CI, 1.15-32.66; P = .033) and children (RR, 6.28; 95% CI, 3.43-11.49; P < .001).
MAD was also associated with a higher seizure freedom rate compared with UD (RR, 5.94; 95% CI, 1.93-18.31; P = .002).
Cholesterol Concern
In Dr. Raj’s analysis, there was an increase in blood total cholesterol level after 3 months of the MAD (standardized mean difference, -0.82; 95% CI, -1.23 to -0.40; P = .0001).
Concern about elevated blood cholesterol affecting coronary artery disease risk may explain why doctors sometimes shy away from recommending the MAD to their adult patients. “Some may not want to take that risk; you don’t want patients to succumb to coronary artery disease,” said Dr. Raj.
She noted that 3 months “is a very short time period,” and studies looking at cholesterol levels at the end of at least 1 year are needed to determine whether levels return to normal.
“We’re seeing a lot of literature now that suggests dietary intake does not really have a link with cholesterol levels,” she said. If this can be proven, “then this is definitely a great therapy.”
The evidence of cardiovascular safety of the MAD includes a study of 37 patients showing that although total cholesterol and low-density lipoprotein (LDL) cholesterol increased over the first 3 months of MAD treatment, these values normalized within 1 year of treatment, including in patients treated with the MAD for more than 3 years.
Primary Diet Recommendation
This news organization asked one of the authors of that study, Mackenzie C. Cervenka, MD, professor of neurology and medical director of the Adult Epilepsy Diet Center, Johns Hopkins Hospital, Baltimore, Maryland, to comment on the new research.
She said that she was “thrilled” to see more evidence showing that this diet therapy can be as effective for adults as for children. “This is a really important message to get out there.”
At her adult epilepsy diet center, the MAD is the “primary” diet recommended for patients who are resistant to seizure medication, not tube fed, and are keen to try diet therapy, said Dr. Cervenka.
In her experience, the likelihood of having a 50% or greater seizure reduction is about 40% among medication-resistant patients, “so very similar to what they reported in that review,” she said.
However, she noted that she emphasizes to patients that “diet therapy is not meant to be monotherapy.”
Dr. Cervenka’s team is examining LDL cholesterol levels as well as LDL particle size in adults who have been on the MAD for 2 years. LDL particle size, she noted, is a better predictor of long-term cardiovascular health.
No conflicts of interest were reported.
A version of this article appeared on Medscape.com.
ORLANDO —
The results of the small new review and meta-analysis suggest that “the MAD may be an effective adjuvant therapy for older patients who have failed anti-seizure medications,” study investigator Aiswarya Raj, MBBS, Aster Malabar Institute of Medical Sciences, Kerala, India, said in an interview.
The findings were presented at the annual meeting of the American Epilepsy Society.
Paucity of Adult Data
The MAD is a less restrictive hybrid of the ketogenic diet that limits carbohydrate intake and encourages fat consumption. It does not restrict fluids, calories, or proteins and does not require fats to be weighed or measured.
The diet includes fewer carbohydrates than the traditional Atkins diet and places more emphasis on fat intake. Dr. Raj said that the research suggests that the MAD “is a promising therapy in pediatric populations, but there’s not a lot of data in adults.”
Dr. Raj noted that this diet type has not been that popular in patients who clinicians believe might be better treated with drug therapy, possibly because of concern about the cardiac impact of consuming high-fat foods.
After a systematic literature review assessing the efficacy of the MAD in adults, the researchers included in their analysis three randomized controlled trials and four observational studies published from January 2000 to May 2023.
The randomized controlled trials in the review assessed the primary outcome, a greater than 50% seizure reduction, at the end of 2 months, 3 months, and 6 months. In the MAD group, 32.5% of participants had more than a 50% seizure reduction vs 3% in the control group (odds ratio [OR], 12.62; 95% CI, 4.05-39.29; P < .0001).
Four participants who followed the diet achieved complete seizure freedom compared with no participants in the control group (OR, 16.20; 95% CI, 0.82-318.82; P = .07).
The prospective studies examined this outcome at the end of 1 month or 3 months. In these studies, 41.9% of individuals experienced more than a 50% seizure reduction after 1 month of following the MAD, and 34.2% experienced this reduction after 3 months (OR, 1.41; 95% CI, 0.79-2.52; P = .24), with zero heterogeneity across studies.
It’s difficult to interpret the difference in seizure reduction between 1 and 3 months of therapy, Dr. Raj noted, because “there’s always the issue of compliance when you put a patient on a long-term diet.”
Positive results for MAD in adults were shown in another recent systematic review and meta-analysis published in Seizure: European Journal of Epilepsy.
That analysis included six studies with 575 patients who were randomly assigned to MAD or usual diet (UD) plus standard drug therapy. After an average follow-up of 12 weeks, MAD was associated with a higher rate of 50% or greater reduction in seizure frequency (relative risk [RR], 6.28; 95% CI, 3.52-10.50; P < .001), both in adults with drug-resistant epilepsy (RR, 6.14; 95% CI, 1.15-32.66; P = .033) and children (RR, 6.28; 95% CI, 3.43-11.49; P < .001).
MAD was also associated with a higher seizure freedom rate compared with UD (RR, 5.94; 95% CI, 1.93-18.31; P = .002).
Cholesterol Concern
In Dr. Raj’s analysis, blood total cholesterol increased after 3 months of the MAD (standardized mean difference, -0.82; 95% CI, -1.23 to -0.40; P = .0001).
Concern about elevated blood cholesterol affecting coronary artery disease risk may explain why doctors sometimes shy away from recommending the MAD to their adult patients. “Some may not want to take that risk; you don’t want patients to succumb to coronary artery disease,” said Dr. Raj.
She noted that 3 months “is a very short time period,” and studies looking at cholesterol levels at the end of at least 1 year are needed to determine whether levels return to normal.
“We’re seeing a lot of literature now that suggests dietary intake does not really have a link with cholesterol levels,” she said. If this can be proven, “then this is definitely a great therapy.”
The evidence for the cardiovascular safety of the MAD includes a study of 37 patients that showed that although total cholesterol and low-density lipoprotein (LDL) cholesterol increased over the first 3 months of MAD treatment, these values normalized within 1 year, including in patients treated with the MAD for more than 3 years.
Primary Diet Recommendation
This news organization asked one of the authors of that study, Mackenzie C. Cervenka, MD, professor of neurology and medical director of the Adult Epilepsy Diet Center, Johns Hopkins Hospital, Baltimore, Maryland, to comment on the new research.
She said that she was “thrilled” to see more evidence showing that this diet therapy can be as effective for adults as for children. “This is a really important message to get out there.”
At her adult epilepsy diet center, the MAD is the “primary” diet recommended for patients who are resistant to seizure medication, not tube fed, and are keen to try diet therapy, said Dr. Cervenka.
In her experience, the likelihood of having a 50% or greater seizure reduction is about 40% among medication-resistant patients, “so very similar to what they reported in that review,” she said.
However, she noted that she emphasizes to patients that “diet therapy is not meant to be monotherapy.”
Dr. Cervenka’s team is examining LDL cholesterol levels as well as LDL particle size in adults who have been on the MAD for 2 years. LDL particle size, she noted, is a better predictor of long-term cardiovascular health.
No conflicts of interest were reported.
A version of this article appeared on Medscape.com.
FROM AES 2023
CGRP in migraine prodrome can stop headache, reduce severity
BARCELONA, SPAIN —
“This represents a totally different way of treating a migraine attack – to treat it before the headache starts. This is a paradigm shift in the way we approach the acute treatment of migraine,” study investigator Peter Goadsby, MBBS, MD, PhD, professor of neurology at King’s College London, UK, said in an interview.
The findings were presented at the 17th European Headache Congress (EHC) and were also recently published online in The Lancet.
A New Way to Manage Migraine?
The prodrome is usually the earliest phase of a migraine attack and is believed to be experienced by the vast majority of patients with migraine. It consists of various symptoms, including sensitivity to light, fatigue, mood changes, cognitive dysfunction, craving certain foods, and neck pain, which can occur several hours or days before onset.
Dr. Goadsby noted that, at present, there isn’t very much a patient can do about the prodrome.
“We advise patients if they feel an attack is coming not to do anything that might make it worse and make sure they have their acute treatment available for when the headache phase starts. So, we just advise people to prepare for the attack rather than doing anything specific to stop it. But with new data from this study, we now have something that can be done. Patients have an option,” he said.
Dr. Goadsby explained that currently patients are not encouraged to use acute migraine medications such as triptans in the prodrome phase.
“There is actually no evidence that taking a triptan during the prodromal phase works. The advice is to take a triptan as soon as the headache starts, but not before the headache starts.”
He noted that there is also the problem of medication overuse that is seen with triptans, and most other medications used to treat acute migraine, which leads to medication overuse headache, “so we don’t like to encourage patients to increase the frequency of taking triptans for this reason.”
But ubrogepant and other members of the “gepant” class do not seem to have the propensity for medication overuse problems. “Rather, the more a patient takes the less likely they are to get a headache as these drugs also have a preventative effect,” Dr. Goadsby said.
Major Reduction in Severity
The PRODROME trial was conducted at 75 sites in the United States in 518 patients who had at least a 1-year history of migraine with or without aura and a history of two to eight migraine attacks per month with moderate to severe headache in each of the 3 months before study entry.
Participants underwent a rigorous screening period during which they were required to show that they could identify prodromal symptoms that were reliably followed by migraine headache within 1-6 hours.
They were randomly assigned to receive either placebo to treat the first qualifying prodrome event and ubrogepant 100 mg to treat the second qualifying prodrome event or vice versa, with instructions to take the study drug at the onset of the prodrome event.
Efficacy assessments during the double-blind treatment period were recorded by the participant in an electronic diary. On identifying a qualifying prodrome, the patient recorded prodromal symptoms, and was then required to report the absence or presence of a headache at regular intervals up to 48 hours after the study drug dose. If a headache was reported, participants rated the intensity as mild, moderate, or severe and reported whether rescue medication was taken to treat it.
The primary endpoint was absence of moderate or severe intensity headache within 24 hours after study-drug dose. This occurred after 46% of 418 qualifying prodrome events that had been treated with ubrogepant and after 29% of 423 qualifying prodrome events that had been treated with placebo (odds ratio, 2.09; 95% CI, 1.63-2.69; P < .0001).
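As a rough check on these figures, the unadjusted odds ratio can be reconstructed from the reported percentages and event counts. This is a sketch: the counts below are back-calculated from rounded percentages, and the trial's published estimate is model-based, so the result will only approximate the reported OR of 2.09 (95% CI, 1.63-2.69).

```python
from math import exp, log, sqrt

# Approximate event counts back-calculated from the reported percentages
# (46% of 418 ubrogepant-treated events, 29% of 423 placebo-treated events).
ubro_success, ubro_total = round(0.46 * 418), 418  # headache-free after ubrogepant
plac_success, plac_total = round(0.29 * 423), 423  # headache-free after placebo

# 2x2 table cells
a, b = ubro_success, ubro_total - ubro_success
c, d = plac_success, plac_total - plac_success

odds_ratio = (a / b) / (c / d)

# Wald 95% confidence interval on the log-odds scale
se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = exp(log(odds_ratio) - 1.96 * se)
ci_high = exp(log(odds_ratio) + 1.96 * se)

print(f"OR ≈ {odds_ratio:.2f} (95% CI, {ci_low:.2f}-{ci_high:.2f})")
```

The crude estimate lands close to the published value, which is what one would expect when the randomized groups are well balanced.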
“The incidence of moderate to severe headache was almost halved when ubrogepant was taken in the prodrome,” Dr. Goadsby reported.
Ubrogepant also showed similarly impressive results for the secondary endpoints: absence of moderate to severe headache within 48 hours post-dose and absence of headache of any intensity at 24 hours.
Little to No Disability
The researchers also evaluated functional ability, and more participants reported “no disability or able to function normally” during the 24 hours after treatment with ubrogepant than after placebo (OR, 1.66; P < .0001).
Other findings showed that the prodromal symptoms themselves, such as light sensitivity and cognitive dysfunction, were also reduced with ubrogepant.
Dr. Goadsby said he was pleased but not surprised by the results, as drugs in the “gepant” class are used both in the acute treatment of migraine and as preventive agents, although different agents have been approved for different indications in this regard.
“The ‘gepants’ are a class of medication that can be used in almost any way in migraine — to treat an acute migraine headache, to prevent migraine if taken chronically, and now we see that they can also stop a migraine from developing if taken during the initial prodromal phase. That’s unique for a migraine medication,” he said.
While the current study was conducted with ubrogepant, Dr. Goadsby suspects that any of the “gepants” would probably have a similar effect.
He noted that the prodromal phase of migraine has only just started to be explored, with functional imaging studies showing that structural brain changes occur during this phase.
Dr. Goadsby said the current study opens up a whole new area of interest, emphasizing the clinical value of identifying the prodrome in individuals with migraine, better characterizing the symptomatology of the prodrome, and understanding more about how to treat it.
“It’s the ultimate way of treating migraine early, and by taking this type of medication in the prodromal phase, patients may be able to stop having pain. That’s quite an implication,” he concluded.
The PRODROME study was funded by AbbVie. Dr. Goadsby reports personal fees from AbbVie.
A version of this article appeared on Medscape.com.
FROM EHC 2023
Why Are Prion Diseases on the Rise?
This transcript has been edited for clarity.
In 1986, in Britain, cattle started dying.
The condition, quickly nicknamed “mad cow disease,” was clearly infectious, but the particular pathogen was difficult to identify. By 1993, 120,000 cattle in Britain were identified as being infected. As yet, no human cases had occurred and the UK government insisted that cattle were a dead-end host for the pathogen. By the mid-1990s, however, multiple human cases, attributable to ingestion of meat and organs from infected cattle, were discovered. In humans, variant Creutzfeldt-Jakob disease (CJD) was a media sensation — a nearly uniformly fatal, untreatable condition with a rapid onset of dementia, mobility issues characterized by jerky movements, and autopsy reports finding that the brain itself had turned into a spongy mess.
The United States banned UK beef imports in 1996 and only lifted the ban in 2020.
The disease was made all the more mysterious because the pathogen involved was not a bacterium, parasite, or virus, but a protein — or a proteinaceous infectious particle, shortened to “prion.”
Prions are misfolded proteins that aggregate in cells — in this case, in nerve cells. But what makes prions different from other misfolded proteins is that the misfolded protein catalyzes the conversion of its non-misfolded counterpart into the misfolded configuration. It creates a chain reaction, leading to rapid accumulation of misfolded proteins and cell death.
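The chain-reaction kinetics described above can be illustrated with a toy simulation. This is a sketch under simplifying assumptions, not a biophysical model: the rate constant, time step, and initial seed fraction are arbitrary illustrative values. Each misfolded protein converts normal protein at a fixed rate, so the misfolded fraction follows logistic growth from a tiny seed.

```python
# Toy model of autocatalytic prion conversion: dM/dt = k * M * N,
# with the total pool N + M held fixed. The parameters are arbitrary.
k, dt, steps = 0.5, 0.01, 3000
normal, misfolded = 0.999, 0.001  # fractions of total prion protein

trajectory = []
for _ in range(steps):
    converted = k * misfolded * normal * dt  # conversions this time step
    misfolded += converted
    normal -= converted
    trajectory.append(misfolded)

# The seed barely grows at first, then the chain reaction takes off
# and nearly all protein ends up misfolded.
print(f"misfolded fraction: start {trajectory[0]:.4f}, end {trajectory[-1]:.3f}")
```

The long quiet phase followed by explosive growth is the qualitative point: autocatalysis makes the process look dormant until, quite suddenly, it is not.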
And, like a time bomb, we all have prion protein inside us. In its normally folded state, the function of prion protein remains unclear — knockout mice do okay without it — but it is also highly conserved across mammalian species, so it probably does something worthwhile, perhaps protecting nerve fibers.
Far more common than humans contracting mad cow disease is the condition known as sporadic CJD, responsible for 85% of all cases of prion-induced brain disease. The cause of sporadic CJD is unknown.
But one thing is known: Cases are increasing.
I don’t want you to freak out; we are not in the midst of a CJD epidemic. But it’s been a while since I’ve seen people discussing the condition — which remains as horrible as it was in the 1990s — and a new research letter appearing in JAMA Neurology brought it back to the top of my mind.
Researchers, led by Matthew Crane at Hopkins, used the CDC’s WONDER cause-of-death database, which pulls diagnoses from death certificates. Normally, I’m not a fan of using death certificates for cause-of-death analyses, but in this case I’ll give it a pass. Assuming that the diagnosis of CJD is made, it would be really unlikely for it not to appear on a death certificate.
The main findings are seen here.
Note that we can’t tell whether these are sporadic CJD cases or variant CJD cases or even familial CJD cases; however, unless there has been a dramatic change in epidemiology, the vast majority of these will be sporadic.
The question is, why are there more cases?
Whenever this type of question comes up with any disease, there are basically three possibilities:
First, there may be an increase in the susceptible, or at-risk, population. In this case, we know that older people are at higher risk of developing sporadic CJD, and over time, the population has aged. To be fair, the authors adjusted for this and still saw an increase, though it was attenuated.
Second, we might be better at diagnosing the condition. A lot has happened since the mid-1990s, when the diagnosis was based more or less on symptoms. The advent of more sophisticated MRI protocols as well as a new diagnostic test called “real-time quaking-induced conversion testing” may mean we are just better at detecting people with this disease.
Third (and most concerning), a new exposure has occurred. What that exposure might be, where it might come from, is anyone’s guess. It’s hard to do broad-scale epidemiology on very rare diseases.
But given these findings, it seems that a bit more surveillance for this rare but devastating condition is well merited.
F. Perry Wilson, MD, MSCE, is an associate professor of medicine and public health and director of Yale’s Clinical and Translational Research Accelerator. His science communication work can be found in the Huffington Post, on NPR, and here on Medscape. He tweets @fperrywilson and his new book, How Medicine Works and When It Doesn’t, is available now.
F. Perry Wilson, MD, MSCE, has disclosed no relevant financial relationships.
A version of this article appeared on Medscape.com.
Is migraine really a female disorder?
BARCELONA, SPAIN — Migraine is widely considered a predominantly female disorder. Its frequency, duration, and severity tend to be higher in women, and women are also more likely than men to receive a migraine diagnosis. However, gender expectations, differences in the likelihood of self-reporting, and problems with how migraine is classified make it difficult to estimate its true prevalence in men and women.
Different Symptoms
Headache disorders are estimated to affect 50% of the general population; tension-type headache and migraine are the two most common. According to epidemiologic studies, migraine is more prevalent in women, with a female-to-male ratio of 3:1. Numerous studies have examined why this might be, most focusing largely on female-related factors, such as hormones and the menstrual cycle.
“Despite many years of research, there isn’t one clear factor explaining this substantial difference between women and men,” said Tobias Kurth of Charité – Universitätsmedizin Berlin, Germany. “So the question is: Are we missing something else?”
One factor in these perceived sex differences in migraine is that women seem to report their migraines differently from men, and they also have different symptoms. For example, women are more likely than men to report severe pain, and their migraine attacks are more often accompanied by photophobia, phonophobia, and nausea, whereas men’s migraines are more often accompanied by aura.
“By favoring female symptoms, the classification system may not be picking up male symptoms because they’re not being classified in the right way,” Dr. Kurth said, with one consequence being that migraine is underdiagnosed in men. “Before trying to understand the biological and behavioral reasons for these sex differences, we first need to consider these methodological challenges that we all apply knowingly or unknowingly.”
Christian Lampl, professor of neurology at Konventhospital der Barmherzigen Brüder Linz, Austria, and president of the European Headache Federation, said in an interview, “I’m convinced that this 3:1 ratio which has been stated for decades is wrong, but we still don’t have the data. The criteria we have [for classifying migraine] are useful for clinical trials, but they are useless for determining the male-to-female ratio.
“We need a new definition of migraine,” he added. “Migraine is an episode, not an attack. Attacks have a sudden onset, and migraine onset is not sudden — it is an episode with a headache attack.”
Inadequate Menopause Services
Professor Anne MacGregor of St. Bartholomew’s Hospital in London, United Kingdom, specializes in migraine and women’s health. She presented data showing that migraine is underdiagnosed in women; one reason being that the disorder receives inadequate attention from healthcare professionals at specialist menopause services.
Menopause is associated with an increased prevalence of migraine, but women do not discuss headache symptoms at specialist menopause services, Dr. MacGregor said.
She then described unpublished results from a survey of 117 women attending the specialist menopause service at St. Bartholomew’s Hospital. Among the respondents, 34% reported experiencing episodic migraine and an additional 8% reported having chronic migraine.
“Within this population of women who were not reporting headache as a symptom [to the menopause service until asked in the survey], 42% of them were positive for a diagnosis of migraine,” said Dr. MacGregor. “They were mostly relying on prescribed paracetamol and codeine, or buying it over the counter, and only 22% of them were receiving triptans.
“They are clearly being undertreated,” she added. “Part of this issue is that they didn’t spontaneously report headache as a menopause symptom, so they weren’t consulting for headache to their primary care physicians.”
Correct diagnosis by a consultant is a prerequisite for receiving appropriate migraine treatment. Yet, according to a US study published in 2012, only 45.5% of women with episodic migraine consulted a prescribing healthcare professional. Of those who consulted, 89% were diagnosed correctly, and only 68% of those received the appropriate treatment.
A larger, more recent study confirmed that there is a massive unmet need for improving care in this patient population. The Chronic Migraine Epidemiology and Outcomes (CaMEO) Study, which analyzed data from nearly 90,000 participants, showed that just 4.8% of people with chronic migraine received consultation, correct diagnosis, and treatment, with 89% of women with chronic migraine left undiagnosed.
The OVERCOME Study further revealed that although many people with migraine were repeat consulters, they were consulting their physicians for other health problems.
“This makes it very clear that people in other specialties need to be more aware about picking up and diagnosing headache,” said MacGregor. “That’s where the real need is in managing headache. We have the treatments, but if the patients can’t access them, they’re not much good to them.”
A version of this article appeared on Medscape.com.
FROM EHC 2023
Younger heart disease onset tied to higher dementia risk
TOPLINE:
Adults diagnosed with coronary heart disease (CHD) at younger ages are at increased risk for dementia, with the risk highest — at 36% — if onset is before age 45, results of a large observational study show.
METHODOLOGY:
- The study included 432,667 of the more than 500,000 participants in the UK Biobank (mean age, 56.9 years); 50,685 (11.7%) had CHD, and 50,445 of these had data on age at CHD onset.
- Researchers divided participants into three groups according to age at CHD onset (below 45 years, 45-59 years, and 60 years and older), and carried out a propensity score matching analysis.
- Outcomes included all-cause dementia, Alzheimer's disease (AD), and vascular dementia (VD).
- Covariates included age, sex, race, educational level, body mass index, low-density lipoprotein cholesterol, smoking status, alcohol intake, exercise, depressed mood, hypertension, diabetes, statin use, and apolipoprotein E4 status.
TAKEAWAY:
- During a median follow-up of 12.8 years, researchers identified 5876 cases of all-cause dementia, 2540 cases of AD, and 1220 cases of VD.
- Fully adjusted models showed participants with CHD had significantly higher risks than those without CHD of developing all-cause dementia (hazard ratio [HR], 1.36; 95% CI, 1.28-1.45; P < .001), AD (HR, 1.13; 95% CI, 1.02-1.24; P = .019), and VD (HR, 1.78; 95% CI, 1.56-2.02; P < .001). The higher risk for VD suggests CHD has a more profound influence on neuropathologic changes involved in this dementia type, said the authors.
- Those with CHD diagnosed at a younger age had higher risks of developing dementia (HR per 10-year decrease in age at onset: all-cause dementia, 1.25 [95% CI, 1.20-1.30]; AD, 1.29 [95% CI, 1.20-1.38]; VD, 1.22 [95% CI, 1.13-1.31]; all P < .001).
- Propensity score matching analysis showed patients with CHD had significantly higher risks for dementia compared with matched controls, with the highest risk seen in patients diagnosed before age 45 (HR, 2.40; 95% CI, 1.79-3.20; P < .001), followed by those diagnosed between 45 and 59 years (HR, 1.46; 95% CI, 1.32-1.62; P < .001) and at or above 60 years (HR, 1.11; 95% CI, 1.03-1.19; P = .005), with similar results for AD and VD.
IN PRACTICE:
The findings suggest “additional attention should be paid to the cognitive status of patients with CHD, especially the ones diagnosed with CHD at a young age,” the authors conclude, noting that “timely intervention, such as cognitive training, could be implemented once signs of cognitive deteriorations are detected.”
SOURCE:
The study was conducted by Jie Liang, BS, School of Nursing, Chinese Academy of Medical Sciences & Peking Union Medical College, Beijing, and colleagues. It was published online on November 29, 2023, in the Journal of the American Heart Association.
LIMITATIONS:
Because this is an observational study, causality cannot be established. Although the authors adjusted for many potential confounders, residual confounding from unknown risk factors that also contribute to CHD cannot be ruled out. Because the study excluded 69,744 participants, selection bias is possible, and the mostly White cohort limits generalizability.
DISCLOSURES:
The study was supported by the National Natural Science Foundation of China, the Non-Profit Central Research Institute Fund of the Chinese Academy of Medical Sciences, and the China Medical Board. The authors have no relevant conflicts of interest.
A version of this article first appeared on Medscape.com.
TOPLINE:
, with the risk highest — at 36% — if onset is before age 45, results of a large observational study show.
METHODOLOGY:
- The study included 432,667 of the more than 500,000 participants in the UK Biobank, with a mean age of 56.9 years, 50,685 (11.7%) of whom had CHD and 50,445 had data on age at CHD onset.
- Researchers divided participants into three groups according to age at CHD onset (below 45 years, 45-59 years, and 60 years and older), and carried out a propensity score matching analysis.
- Outcomes included all-cause dementia, AD, and VD.
- Covariates included age, sex, race, educational level, body mass index, low-density lipoprotein cholesterol, smoking status, alcohol intake, exercise, depressed mood, hypertension, diabetes, statin use, and apolipoprotein E4 status.
TAKEAWAY:
- During a median follow-up of 12.8 years, researchers identified 5876 cases of all-cause dementia, 2540 cases of AD, and 1220 cases of VD.
- Fully adjusted models showed participants with CHD had significantly higher risks than those without CHD of developing all-cause dementia (hazard ratio [HR], 1.36; 95% CI, 1.28-1.45; P < .001), AD (HR, 1.13; 95% CI, 1.02-1.24; P = .019), and VD (HR, 1.78; 95% CI, 1.56-2.02; P < .001). The higher risk for VD suggests CHD has a more profound influence on neuropathologic changes involved in this dementia type, said the authors.
- Those with CHD diagnosed at a younger age had higher risks of developing dementia (HR per 10-year decrease in age, 1.25; 95% CI, 1.20-1.30 for all-cause dementia, 1.29; 95% CI, 1.20-1.38 for AD, and 1.22; 95% CI, 1.13-1.31 for VD; P for all < .001).
- Propensity score matching analysis showed patients with CHD had significantly higher risks for dementia compared with matched controls, with the highest risk seen in patients diagnosed before age 45 (HR, 2.40; 95% CI, 1.79-3.20; P < .001), followed by those diagnosed between 45 and 59 years (HR, 1.46; 95% CI, 1.32-1.62; P < .001) and at or above 60 years (HR, 1.11; 95% CI, 1.03-1.19; P = .005), with similar results for AD and VD.
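The propensity score matching used above can be illustrated in a few lines. The following toy sketch uses synthetic data, a single made-up confounder (age), and a hand-rolled logistic fit — none of it from the study — to show how matching on a propensity score balances a confounder between exposed and unexposed groups before outcomes are compared:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: one confounder ('age') drives the exposure (CHD).
# Nothing here is taken from the study.
n = 2000
age = rng.normal(60, 8, n)
chd = (rng.random(n) < 1 / (1 + np.exp(-(age - 70) / 8))).astype(int)

# Fit a one-covariate logistic propensity model P(CHD | age) by gradient descent.
x = (age - age.mean()) / age.std()
w, b = 0.0, 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(w * x + b)))
    w -= 0.1 * np.mean((p - chd) * x)
    b -= 0.1 * np.mean(p - chd)
propensity = 1 / (1 + np.exp(-(w * x + b)))

# 1:1 nearest-neighbor matching on the propensity score, without replacement.
treated = np.where(chd == 1)[0]
controls = list(np.where(chd == 0)[0])
matches = []
for t in treated:
    best = min(controls, key=lambda c: abs(propensity[c] - propensity[t]))
    matches.append((t, best))
    controls.remove(best)

# Matching should shrink the age gap between the compared groups.
gap_before = abs(age[chd == 1].mean() - age[chd == 0].mean())
matched_t = [t for t, _ in matches]
matched_c = [c for _, c in matches]
gap_after = abs(age[matched_t].mean() - age[matched_c].mean())
print(f"age gap before matching: {gap_before:.2f}, after: {gap_after:.2f}")
```

In the actual analysis the propensity model included many covariates and outcomes were then compared with Cox models; this sketch only illustrates the matching idea.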
IN PRACTICE:
The findings suggest “additional attention should be paid to the cognitive status of patients with CHD, especially the ones diagnosed with CHD at a young age,” the authors conclude, noting that “timely intervention, such as cognitive training, could be implemented once signs of cognitive deteriorations are detected.”
SOURCE:
The study was conducted by Jie Liang, BS, School of Nursing, Chinese Academy of Medical Sciences & Peking Union Medical College, Beijing, and colleagues. It was published online on November 29, 2023, in the Journal of the American Heart Association.
LIMITATIONS:
Because this was an observational study, causality cannot be established. Although the authors adjusted for many potential confounders, unmeasured risk factors cannot be ruled out. The exclusion of 69,744 participants raises the possibility of selection bias, and the mostly White study population may limit generalizability.
DISCLOSURES:
The study was supported by the National Natural Science Foundation of China, the Non-Profit Central Research Institute Fund of the Chinese Academy of Medical Sciences, and the China Medical Board. The authors have no relevant conflicts of interest.
A version of this article first appeared on Medscape.com.
Which migraine medications are most effective?
TOPLINE:
Triptans, ergots, and antiemetics are more effective for acute migraine than ibuprofen, new results from a large, real-world analysis of self-reported patient data show.
METHODOLOGY:
- Researchers analyzed nearly 11 million migraine attack records extracted from Migraine Buddy, an e-diary smartphone app, over a 6-year period.
- They evaluated self-reported treatment effectiveness for 25 acute migraine medications among seven classes: acetaminophen, NSAIDs, triptans, combination analgesics, ergots, antiemetics, and opioids.
- A two-level nested multivariate logistic regression model adjusted for within-subject dependency and for concomitant medications taken within each analyzed migraine attack.
- The final analysis included nearly 5 million medication-outcome pairs from 3.1 million migraine attacks in 278,000 medication users.
TAKEAWAY:
- Using ibuprofen as the reference, triptans, ergots, and antiemetics were the three most effective medication classes (mean odds ratios [ORs], 4.80, 3.02, and 2.67, respectively).
- The next most effective medication classes were opioids (OR, 2.49), NSAIDs other than ibuprofen (OR, 1.94), the combination analgesic acetaminophen/acetylsalicylic acid/caffeine (OR, 1.69), and others (OR, 1.49).
- Acetaminophen (OR, 0.83) was the least effective.
- The most effective individual medications were eletriptan (Relpax) (OR, 6.1); zolmitriptan (Zomig) (OR, 5.7); and sumatriptan (Imitrex) (OR, 5.2).
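The study estimated effectiveness with a two-level nested multivariate logistic regression; as a much simpler illustration of what an odds ratio against a reference medication means, here is a toy sketch using invented per-medication "helped" rates (the drug names are real, but the rates and counts are not the study's data, and this is not the study's model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical attack-level records of (medication, helped?). The 'helped'
# probabilities below are invented for illustration only.
p_helped = {"ibuprofen": 0.40, "eletriptan": 0.75, "acetaminophen": 0.35}
records = []
for med, p in p_helped.items():
    for helped in rng.random(5000) < p:
        records.append((med, bool(helped)))

def odds_ratio(records, med, ref="ibuprofen"):
    """Crude 2x2-table odds ratio of 'helped' for med versus the reference."""
    def odds(name):
        helped = sum(1 for m, h in records if m == name and h)
        failed = sum(1 for m, h in records if m == name and not h)
        return helped / failed
    return odds(med) / odds(ref)

print(round(odds_ratio(records, "eletriptan"), 2))     # expected > 1
print(round(odds_ratio(records, "acetaminophen"), 2))  # expected < 1
```

An OR above 1 means the medication was rated helpful more often than ibuprofen, adjusted in the real analysis for within-user dependency and concomitant medications.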
IN PRACTICE:
“Our findings that triptans, ergots, and antiemetics are the most effective classes of medications align with the guideline recommendations and offer generalizable insights to complement clinical practice,” the authors wrote.
SOURCE:
The study, with first author Chia-Chun Chiang, MD, Department of Neurology, Mayo Clinic, Rochester, Minnesota, was published online November 29 in Neurology.
LIMITATIONS:
The findings are based on subjective user-reported ratings of effectiveness, and information on side effects, dosages, and formulations was not available. The newer migraine medication classes, gepants and ditans, were not included because of the relatively low number of treated attacks. The regression model did not include age, gender, pain intensity, and other migraine-associated symptoms, which could potentially affect treatment effectiveness.
DISCLOSURES:
Funding for the study was provided by the Kanagawa University of Human Service research fund. A full list of author disclosures can be found with the original article.
A version of this article first appeared on Medscape.com.
Genetic testing warranted in epilepsy of unknown origin
ORLANDO — Genetic testing is warranted in patients with epilepsy of unknown origin, new research suggests. Investigators found that pathogenic genetic variants were identified in over 40% of patients with epilepsy of unknown cause who underwent genetic testing. Such testing is particularly beneficial for those with early-onset epilepsy and those with comorbid developmental delay, said study investigator Yi Li, MD, PhD, clinical assistant professor, Department of Neurology & Neurological Sciences, Stanford University School of Medicine, Stanford, California.
But every patient with epilepsy of unknown etiology needs to consider genetic testing as part of their standard workup.
Dr. Li noted research showing that a diagnosis of a genetic epilepsy leads to alteration of treatment in about 20% of cases — for example, starting a specific antiseizure medication or avoiding a treatment such as a sodium channel blocker in patients diagnosed with Dravet syndrome. A genetic diagnosis also may make patients eligible for clinical trials investigating gene therapies.
Genetic testing results may end a long and exhausting “diagnostic odyssey” that families have been on, she said. Patients often wait more than a decade to get genetic testing, the study found.
The findings were presented at the annual meeting of the American Epilepsy Society (AES).
Major Delays
About 20%-30% of epilepsy is caused by acquired conditions such as stroke, tumor, or head injury. The remaining 70%-80% is believed to be due to one or more genetic factors.
Genetic testing has become standard for children with early-onset epilepsy, but it’s not common practice among adults with the condition — at least not yet.
The retrospective study involved a chart review of patient electronic health records from 2018-2023. Researchers used the Stanford electronic health record Cohort Discovery tool (STARR) database to identify 286 patients over age 16 years with epilepsy who had records of genetic testing.
Of the 286 patients, 148 were male and 138 female, and the mean age was approximately 30 years. Among those with known epilepsy types, 53.6% had focal epilepsy and 28.8% had generalized epilepsy.
The mean age of seizure onset was 11.9 years, but the mean age at genetic testing was 25.1 years. “There’s a gap of about 13 or 14 years for genetic workup after a patient has a first seizure,” said Dr. Li.
Such a “huge delay” means patients may miss out on “potential precision treatment choices,” she said.
And having a diagnosis can connect patients to others with the same condition as well as to related organizations and communities that offer support, she added.
Types of genetic testing identified in the study included panel testing, which looks at the genes associated with epilepsy; whole exome sequencing (WES), which includes all 20,000 genes in one test; and microarray testing, which assesses missing sections of chromosomes. WES had the highest diagnostic yield (48%), followed by genetic panel testing (32.7%) and microarray testing (20.9%).
These tests collectively identified pathogenic variants in 40.9% of patients. In addition, test results showed that 53.1% of patients had variants of uncertain significance.
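Diagnostic yield, as used above, is simply the share of tested patients in whom a pathogenic variant was found. A minimal sketch (the counts below are hypothetical, chosen only to mirror the reported percentages, and are not taken from the study's tables):

```python
# Diagnostic yield = patients with a pathogenic finding / patients tested.
# All counts are illustrative placeholders, not the study's data.
def diagnostic_yield(positive_tests, patients_tested):
    return positive_tests / patients_tested

wes_yield = 100 * diagnostic_yield(48, 100)      # a WES-like yield, %
cohort_yield = 100 * diagnostic_yield(117, 286)  # a cohort-level yield, %
print(round(wes_yield, 1), round(cohort_yield, 1))
```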
In the full cohort, the most commonly identified variants were mutations in TSC1 (which causes tuberous sclerosis), SCN1A (which causes Dravet syndrome), and MECP2. Among patients with seizure onset after age 1 year, MECP2 and DEPDC5 were the two most commonly identified pathogenic variants.
Researchers examined factors possibly associated with a higher risk for genetic epilepsy, including family history, comorbid developmental delay, febrile seizures, status epilepticus, perinatal injury, and seizure onset age. In an adjusted analysis, comorbid developmental delay (estimate, 2.338; 95% confidence interval [CI], 1.402-3.900; P = .001) and seizure onset before 1 year (estimate, 2.365; 95% CI, 1.282-4.366; P = .006) predicted higher yield of pathogenic variants related to epilepsy.
Dr. Li noted that study participants with a family history of epilepsy were not more likely to test positive for a genetic link, so doctors shouldn’t rule out testing in patients if there’s no family history.
Both the International League Against Epilepsy (ILAE) and the National Society of Genetic Counselors (NSGC) recommend genetic testing in adult epilepsy patients, with the AES endorsing the NSGC guideline.
Although testing is becoming increasingly accessible, insurance companies don’t always cover the cost.
Dr. Li said she hopes her research raises awareness among clinicians that there’s more they can do to improve care for epilepsy patients. “We should offer patients genetic testing if we don’t have a clear etiology.”
Valuable Evidence
Commenting on the research findings, Annapurna Poduri, MD, MPH, director, Epilepsy Genetics Program, Boston Children’s Hospital, Boston, Massachusetts, said this research “is incredibly important.”
“What’s really telling about this study and others that have come up over the last few years is they’re real-world retrospective studies, so they’re looking back at patients who have been seen over many, many years.”
The research provides clinicians, insurance companies, and others with evidence that genetic testing is “valuable and can actually improve outcomes,” said Dr. Poduri.
She noted that 20 years ago, there were only a handful of genes identified as being involved with epilepsy, most related to sodium or potassium channels. But since then, “the technology has just raced ahead” to the point where now “dozens of genes” have been identified.
Not only does knowing the genetic basis of epilepsy improve management, but it offers families some peace of mind. “They blame themselves” for their loved one’s condition, said Dr. Poduri. “They may worry it was something they did in pregnancy; for example, maybe it was because [they] didn’t take that vitamin one day.”
Diagnostic certainty also means that patients “don’t have to do more tests which might be invasive” and unnecessarily costly.
Drs. Li and Poduri report no relevant conflicts of interest.
A version of this article first appeared on Medscape.com.
FROM AES 2023
Excessive TV-watching tied to elevated risk for dementia, Parkinson’s disease, and depression
TOPLINE:
Watching TV for 4 or more hours per day is associated with a higher risk for dementia, Parkinson’s disease (PD), and depression, whereas a limited amount of daily computer use that is not work-related is linked to a lower risk for dementia.
METHODOLOGY:
- Investigators analyzed data on 473,184 people aged 39-72 years from the UK Biobank who were enrolled from 2006 to 2010 and followed until a diagnosis of dementia, PD, depression, death, or study end (2018 for Wales residents; 2021 for residents of England and Scotland).
- Participants reported on the number of hours they spent outside of work exercising, watching television, and using the computer.
- MRI was conducted to determine participants’ brain volume.
TAKEAWAY:
- During the study, 6096 people developed dementia, 3000 developed PD, 23,600 developed depression, 1200 developed dementia and depression, and 486 developed PD and depression.
- Compared with those who watched TV for under 1 hour per day, those who reported watching 4 or more hours per day had a 28% higher risk for dementia (adjusted hazard ratio [aHR], 1.28; 95% CI, 1.17-1.39), a 35% higher risk for depression (aHR, 1.35; 95% CI, 1.29-1.40), and a 16% greater risk for PD (aHR, 1.16; 95% CI, 1.03-1.29).
- However, moderate computer use outside of work seemed somewhat protective. Participants who used the computer for 30-60 minutes per day had lower risks for dementia (aHR, 0.68; 95% CI, 0.64-0.72), PD (aHR, 0.86; 95% CI, 0.79-0.93), and depression (aHR, 0.85; 95% CI, 0.83-0.88) compared with those who reported the lowest levels of computer usage.
- Replacing 30 minutes per day of computer time with an equal amount of structured exercise was associated with decreased risk for dementia (aHR, 0.74; 95% CI, 0.85-0.95) and PD (aHR, 0.84; 95% CI, 0.78-0.90).
IN PRACTICE:
The association between extended periods of TV use and higher risk for PD and dementia could be explained by a lack of activity, the authors note. They add that sedentary behavior is "associated with biomarkers of low-grade inflammation and changes in inflammation markers that could initiate and/or worsen neuroinflammation and contribute to neurodegeneration."
SOURCE:
Hanzhang Wu, PhD, of Tianjin University of Traditional Chinese Medicine in Tianjin, China, led the study, which was published online in the International Journal of Behavioral Nutrition and Physical Activity.
LIMITATIONS:
Screen behaviors were assessed using self-report measures, which are subject to recall bias. In addition, there may have been confounding variables for which the investigators did not account.
DISCLOSURES:
The study was funded by the National Natural Science Foundation of China, the Tianjin Major Public Health Science and Technology Project, the National Health Commission of China, the Food Science and Technology Foundation of Chinese Institute of Food Science and Technology, the China Cohort Consortium, and the Chinese Nutrition Society Nutrition Research Foundation–DSM Research Fund, China. There were no disclosures reported.
Eve Bender has no relevant financial relationships.
A version of this article appeared on Medscape.com.