Consider caregiver oral health for children with bleeding disorders
SAN DIEGO – Caregiver oral health status is an identifiable risk factor that could be used to screen for poor oral health among children and young adults with bleeding disorders, results from a single-center study suggest.
“We ask parents one simple question: ‘Have you had a cavity in the last year?’ If they say yes, we would be more concerned that their children would be more likely to have poor oral health,” Elizabeth Hastie said in an interview during a poster session at the biennial summit of the Thrombosis & Hemostasis Societies of North America.
Proper oral health may prevent joint disease and other conditions that predispose patients to bleeds, according to Ms. Hastie, a fourth-year medical student at Emory University, Atlanta. However, of the 147 hemophilia treatment centers in the United States, just 30% have a dentist on staff, while 90% of centers have expressed interest in increasing patient education in oral health.
In an effort to evaluate the dental habits, needs, and oral health issues of children and young adults up to age 18 with bleeding disorders, Ms. Hastie and her associates conducted a cross-sectional study of 226 patients who were evaluated by a staff dental hygienist at Children’s Healthcare of Atlanta Comprehensive Bleeding Disorders Clinic from May 2016 to October 2017.
The evaluation consisted of an oral screening and a 14-question survey, derived from the American Academy of Pediatric Dentistry Caries-Risk Assessment Tool, completed by the primary caregiver present during the visit. The researchers extracted demographic and clinical characteristics from each patient’s chart, including age, race, county of residence, and bleeding disorder type and severity.
Nearly half of the patients (44%) reported they did not brush their teeth twice a day. Children younger than 5 years were more likely not to brush their teeth twice a day, compared with children aged 5-14 years and young adults aged 15-18 years (57% vs. 44% and 31%, respectively; P = .08).
More than one-quarter of patients (27%) reported not having a current dentist, and 15% reported specific challenges with access to dental care, including burdens related to distance, insurance coverage, and finding a provider willing to treat them in the setting of their medical condition. Those who were Medicaid eligible or of low socioeconomic status were significantly more likely to report dental care access issues, compared with other patients (20% vs. 9%; P = .01).
Oral screening performed by the dental hygienist demonstrated significant oral pathology: 89% of patients had plaque accumulation, 57% had white spots or decalcifications, 37% had gingivitis, and 8% had suspicious lesions suggestive of dental caries.
The researchers also found that having a caregiver with active oral disease in the past 12 months increased the odds of suspicious lesions (odds ratio, 4.34), increased the odds of gingivitis (OR, 3.80), and decreased the odds of the patients’ brushing their teeth at least twice per day (OR, 0.17).
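For readers less familiar with the statistic, an odds ratio compares the odds of an outcome between two groups; an OR of 4.34 means the odds of a suspicious lesion were roughly 4.3 times higher when the caregiver had active oral disease. The sketch below shows the calculation using entirely hypothetical counts, since the study reports only the ORs, not the underlying 2x2 tables:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome.
    OR = (a/b) / (c/d)
    """
    return (a / b) / (c / d)

# Hypothetical counts for illustration only (NOT from the study):
# 10 of 30 children with an affected caregiver had lesions;
# 5 of 45 children without an affected caregiver had lesions.
print(odds_ratio(10, 20, 5, 40))  # (10/20) / (5/40) = 4.0
```

An OR below 1, such as the 0.17 reported for twice-daily brushing, indicates the outcome was less likely in the exposed group.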
“Hopefully, if we can target those high-risk patients in clinic, we could reduce costs, the number of bleeds, the number of products and factor used, and potentially even morbidity in the future,” Ms. Hastie said.
She acknowledged certain limitations of the study, including its single-center design and the fact that a dental hygienist performed the majority of evaluations. She reported having no financial disclosures.
REPORTING FROM THSNA 2018
Key clinical point: Caregiver oral health status may help identify children and young adults with bleeding disorders who are at risk for poor oral health.
Major finding: Having a caregiver with active oral disease in the past 12 months increased the odds of the child having a suspicious lesion (OR 4.34) and gingivitis (OR 3.80).
Study details: A cross-sectional study of 226 pediatric patients who were evaluated by a dental hygienist.
Disclosures: Ms. Hastie reported having no financial disclosures.
Source: Hastie E et al. THSNA 2018, Poster 150.
Genetic markers may help predict allogeneic SCT outcomes
SALT LAKE CITY – The presence of P2X7 receptor single nucleotide polymorphisms (SNPs) associated with gain and loss of function may help predict outcomes after allogeneic stem cell transplantation, according to findings from a clinical correlate analysis of recipient and donor DNA samples.
The findings require validation in future studies but suggest that the presence of the SNPs and of P2X7 haplotypes 2 and 4 could be incorporated into disease risk models to improve transplant decision making, David Stuart Ritchie, MD, reported at the combined annual meetings of the Center for International Blood & Marrow Transplant Research and the American Society for Blood and Marrow Transplantation.
The analysis, which looked specifically for the presence of 16 previously identified SNPs and of haplotypes 2 and 4 in P2X7, was performed on pretransplant DNA samples from 333 allogeneic stem cell transplant recipients and 228 donors at a single center between 2002 and 2013. The findings were correlated with patient outcomes.
Five SNPs were excluded from correlation because of low frequency, and of the 11 remaining SNPs, 3 were found to be significantly associated with reduced incidence of acute and/or chronic graft-versus-host disease (GVHD), and 2 were significantly associated with increased relapse or transplant-related mortality, said Dr. Ritchie of the University of Melbourne, Parkville, Australia.
The loss-of-function SNPs rs28360457 and rs3751133 – each linked with decreased inflammation – were significantly associated with a reduced incidence of acute GVHD when comparing grade 0 with grades 1-4 GVHD (P = .0234 and P = .0411, respectively), but not when comparing grades 0-1 and grades 2-4 GVHD.
SNP rs3751133 was also significantly associated with a reduced incidence of chronic GVHD when comparing grade 0 with grades 1-4 GVHD (P = .01), but not when comparing grades 0-1 and grades 2-4 GVHD, he said.
The loss-of-function SNP rs1653624 – which is linked with decreased phagocytosis – was associated with an increased incidence of acute GVHD when comparing grade 0 with grades 1-4 GVHD (P = .01), but not when comparing grades 0-1 and grades 2-4 GVHD (P = NS).
SNP rs7958311, which is associated with increased surface expression, showed a trend toward increased relapse risk (P = .053), and the loss-of-function SNP rs1653624 was associated with an excess of early transplant-related mortality (P = .0471).
“Individual SNPs are interesting, but perhaps more interesting are the haplotypes,” Dr. Ritchie said.
Haplotype 4, which was found in eight recipients, involves rs1718119 and rs7958311. Those SNPs were previously shown to have 300% and 195% increased expression, respectively, with or without rs2230912, which has been shown to be decreased by 72%, and with or without the loss-of-function SNP rs1653624. Haplotype 4 is associated with a net increase in P2X7 activity, he explained.
In the current study, haplotype 4 was found to involve only rs1718119 co-inherited with rs2230912 and was associated with substantially decreased relapse-free survival overall (hazard ratio, 0.6946) when compared with haplotype 2 (HR, 0.2078), he said.
The differences between haplotype 4 and haplotype 2, and between haplotype 4 and patients with neither haplotype, were highly statistically significant.
Relapse-free survival did not differ significantly between those with haplotype 2 and those with neither haplotype (HR, 0.7717). Similarly, overall survival was significantly poorer among those with haplotype 4 versus haplotype 2 or no haplotype (HR, 0.2812 and 0.2882, respectively), but no difference was seen in overall survival between those with haplotype 2 and those with no haplotype (HR, 1.003), he said.
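As a rough guide to interpreting these hazard ratios: under the proportional-hazards assumption, a hazard ratio multiplies the cumulative hazard, which is equivalent to raising the baseline survival probability to the power of the HR. The sketch below uses illustrative numbers only, not values taken from the study:

```python
def survival_with_hr(baseline_survival, hazard_ratio):
    """Under proportional hazards, S_1(t) = S_0(t) ** HR:
    multiplying the cumulative hazard by HR exponentiates
    the survival function."""
    return baseline_survival ** hazard_ratio

# Illustrative only (NOT study data): if baseline survival at
# some time t is 60%, a group with HR = 0.28 relative to that
# baseline would have survival of about 87% at the same time.
print(round(survival_with_hr(0.60, 0.28), 3))
```

An HR of 1.0 leaves survival unchanged, which is why the reported HR of 1.003 between haplotype 2 and no haplotype corresponds to no detectable survival difference.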
P2X7 is a purinergic signaling receptor located at chromosome 12q24, a region associated with inflammatory disorders. It plays an important role in immunogenic cell death and is expressed in all leukocytes, with the highest level of expression seen in the monocyte lineage. Binding of its ligand – extracellular adenosine 5’-triphosphate – activates dendritic cells and triggers release of IL-1β, leading to T-cell recruitment and the production of memory T cells.
Both gain- and loss-of-function SNPs in P2X7 have been reported and implicated in GVHD and other inflammatory disorders, Dr. Ritchie said, explaining the rationale for studying their correlations with outcomes after allogeneic stem cell transplantation.
While such transplants are highly effective for treating hematologic malignancies, outcomes can be adversely affected by infection, acute organ dysfunction, and GVHD. Because pretransplant conditioning regimens are associated with high levels of immunogenic cell death and the release of extracellular adenosine 5’-triphosphate, signaling through the P2X7 receptor may lead to activation of downstream effectors that influence transplant outcome, he noted.
“We hypothesized that germline gain or loss of function polymorphisms in this receptor in recipients of allogeneic transplantation would result in an adverse outcome,” he said.
The mean age of the recipients whose samples were analyzed was 46 years, and about half were women. Most (83.8%) had a peripheral blood graft source and 64% of transplants were from related donors. The nonrelapse mortality at 24 months was 12.98%, Dr. Ritchie said, noting that their indications for transplantation were “fairly representative of the adult transplant population, dominated by acute leukemia with a range of other acute conditions.”
The findings – particularly those with respect to haplotype 4, which had the most substantial impact – could play a role in patient risk assessment.
“Potentially, although it is a relatively uncommon haplotype, pretransplant identification of haplotype 4 may well have implications for transplant decision making, given the fact that the majority of our patients with this haplotype did not survive posttransplant,” he concluded, noting that an effort to validate the findings is ongoing in an additional 300 patients.
Dr. Ritchie reported having no financial disclosures.
SOURCE: Koldej R et al. The 2018 BMT Tandem Meetings, Abstract 22.
REPORTING FROM THE 2018 BMT TANDEM MEETINGS
Key clinical point: P2X7 receptor SNPs and haplotypes may help predict outcomes after allogeneic stem cell transplantation.
Major finding: Haplotype 4 was associated with substantially decreased relapse-free survival overall (hazard ratio, 0.6946) versus haplotype 2 (HR, 0.2078).
Study details: A clinical correlate analysis of 561 DNA samples.
Disclosures: Dr. Ritchie reported having no financial disclosures.
Source: Koldej R et al. The 2018 BMT Tandem Meetings, Abstract 22.
New options emerge for primary biliary cholangitis
LAS VEGAS – Evidence is mounting for several adjuvant treatments that may be appropriate for patients with primary biliary cholangitis who are not responding to first-line ursodeoxycholic acid (UDCA), according to Cynthia Levy, MD.
Given these new potential treatment options, it’s important for clinicians to assess biochemical response to first-line UDCA, said Dr. Levy, assistant director for the Schiff Center for Liver Diseases at the University of Miami.
“Up until recently, we didn’t have anything to offer to nonresponders,” Dr. Levy said at the inaugural Perspectives in Digestive Diseases meeting held by Global Academy for Medical Education. “Now we know at 1 year, we need to restratify to evaluate the need for adjuvant therapy.”
Many UDCA-treated patients will not respond to that first-line treatment, putting them at risk for progression to hepatocellular carcinoma and end-stage liver disease, according to Dr. Levy.
Obeticholic acid, a farnesoid X receptor agonist, is now a Food and Drug Administration–approved option for these patients, while fibrates and budesonide have recent data supporting their use and are available off-label, she added.
The conditional FDA approval for obeticholic acid, granted in May 2016, is for treatment as monotherapy in patients who do not tolerate UDCA or in combination with UDCA for patients who had had incomplete responses to that treatment for at least a year. Improvement in survival without liver transplantation has not yet been demonstrated for this agent, though studies are ongoing, Dr. Levy noted.
In the meantime, research is progressing with the peroxisome proliferator-activated receptor alpha (PPAR-alpha) agonists fenofibrate and bezafibrate.
In the BEZURSO study, presented at the 2017 EASL Congress, 100 patients with incomplete response to UDCA were randomized to bezafibrate plus UDCA or placebo plus UDCA for 2 years. The primary endpoint, normalization of liver function tests at 2 years, was achieved in 30% of the bezafibrate group and 0% of the placebo group. In addition, 67% of the bezafibrate-treated patients had alkaline phosphatase normalization, compared with 0% in the placebo group.
Bezafibrate was also associated with significant reductions in fatigue and itching, as well as surrogate liver fibrosis markers, according to investigators. The serious adverse event rate was similar between groups, as was the rate of end-stage liver complications, which was 4% for each arm.
Fenofibrate was studied in a small 20-patient, open-label, phase 2 study by Dr. Levy and her colleagues. “Alkaline phosphatase significantly improved fairly early when the drug was started,” she said. Treatment was for 48 weeks, and once the drug was discontinued, there was a rebound in alkaline phosphatase levels.
Seladelpar is another PPAR agonist far along in the pipeline, according to Dr. Levy. This selective PPAR-delta agonist demonstrated significant improvements in alkaline phosphatase and other measures in a 12-week trial that included 75 patients with primary biliary cholangitis and incomplete response to UDCA.
While fenofibrate and bezafibrate can be used off-label for primary biliary cholangitis, Dr. Levy said, fenofibrate labeling indicates that it is contraindicated in patients with hepatic or severe renal dysfunction, including primary biliary cholangitis.
According to Dr. Levy, that contraindication is based on experience with clofibrate, a first-generation fibrate that was associated with increased risk of gallstone formation. “If you choose to use fenofibrate, this is off-label use, and you need to warn your patients,” she told attendees at the meeting.
Two other agents under investigation in primary biliary cholangitis that have shown some promising results recently, according to Dr. Levy, are budesonide and an engineered variant of the human hormone FGF19 known as NGM282.
Global Academy and this news organization are owned by the same parent company.
Dr. Levy reported disclosures related to CymaBay Therapeutics, Enanta Pharmaceuticals, Genfit, GenKyoTex, Gilead Sciences, GlaxoSmithKline, Intercept Pharmaceuticals, NGM Biopharmaceuticals, and Novartis.
EXPERT ANALYSIS FROM PERSPECTIVES IN DIGESTIVE DISEASES
Low Vitamin B12 Level Is Associated With Worsening in Parkinson’s Disease
People with early, untreated Parkinson’s disease who have low vitamin B12 levels appear to have greater worsening of mobility and cognitive decline over time, according to research published online ahead of print March 6 in Movement Disorders. The results suggest that correcting low levels may slow disease progression.
Previous research revealed that low serum vitamin B12 levels are common in patients with moderately advanced Parkinson’s disease and are associated with neuropathy and cognitive impairment. Investigators led by Chadwick W. Christine, MD, a neurologist at the University of California, San Francisco, sought to understand what contributes to variation in the progression of Parkinson’s disease.
An Analysis of the DATATOP Study
Because little is known about B12’s role in early disease, the investigators analyzed data from patients with early, untreated Parkinson’s disease who participated in the DATATOP study, a double-blind, randomized trial designed to test whether treatment with selegiline, the antioxidant alpha-tocopherol, or both slowed disease progression.
They measured serum methylmalonic acid, homocysteine, and holotranscobalamin in addition to B12 because of the limited sensitivity of serum B12 testing alone to detect B12 deficiency. At baseline, 13% of 680 patients had borderline-low B12 levels (ie, less than 184 pmol/L), and 5% had deficient B12 levels (ie, less than 157 pmol/L). Homocysteine was moderately elevated (ie, greater than 15 µmol/L) in 7% of subjects, and 14% of patients with borderline-low B12 also had elevated homocysteine.
Homocysteine Was Associated With Cognitive Outcomes
Low B12 at baseline predicted greater worsening of mobility, as reflected in a higher ambulatory capacity score. Investigators calculated this score by adding the falling, freezing when walking, walking, gait, and postural stability scores of the Unified Parkinson’s Disease Rating Scale (UPDRS). Participants in the low-B12 tertile (ie, less than 234 pmol/L) developed greater morbidity, with an annualized worsening of the ambulatory capacity score of 1.53, compared with 0.77 in the upper tertile. The worsening mostly resulted from poorer gait and postural instability.
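For readers who want to see the arithmetic, the composite described above can be sketched as follows. This is an illustrative example only, not code from the study; the item values and follow-up interval are hypothetical.

```python
def ambulatory_capacity(falling, freezing, walking, gait, postural_stability):
    """Sum the five UPDRS items that make up the ambulatory capacity score."""
    return falling + freezing + walking + gait + postural_stability

def annualized_change(baseline, followup, years):
    """Annualized worsening of a score; positive values indicate decline."""
    return (followup - baseline) / years

# Hypothetical patient: a score of 1 on each of the five items at baseline,
# worsening to a composite of 8 after two years.
baseline_score = ambulatory_capacity(1, 1, 1, 1, 1)   # composite of 5
print(annualized_change(baseline_score, 8, 2))        # 1.5 points per year
```

An annualized change of 1.5 on this composite would sit near the 1.53 reported for the low-B12 tertile, roughly double the 0.77 seen in the upper tertile.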
“We consider the magnitude of difference to be clinically relevant, particularly given that components of gait dysfunction that develop in Parkinson’s disease may not respond to dopaminergic treatments or [deep brain stimulation],” said Dr. Christine and colleagues.
Elevated homocysteine predicted greater cognitive decline. Baseline elevated homocysteine was associated with lower baseline Mini-Mental State Examination (MMSE) score, as well as greater annualized decline in MMSE (–1.96 vs 0.06).
Of the 456 subjects who continued in the study for 9 to 24 months and had a second blood sample available, 226 had an increase of more than 20% in B12 levels, 210 stayed within 20% of the original B12 measurement, and 19 had a decrease greater than 20%.
Overall, mean annualized increase in B12 was 52.6 pmol/L, mean annualized decrease of homocysteine was 0.83 µmol/L, and mean annualized increase of holotranscobalamin was 14.7 pmol/L.
“These findings are consistent with improved nutritional status during the course of the study, likely attributed to subjects starting the optional [multivitamin] after the baseline visit and/or subjects changing their diets,” said the investigators.
While the improvement in B12 status did not lead to statistically significant improvements in UPDRS scores, there was a trend toward improvement, which provides “support for a disease-modifying effect of B12,” they added.
The researchers speculated that a link between low B12 levels and worse outcomes could be attributed to an independent comorbid effect on the CNS and peripheral nervous system or a direct effect on Parkinson’s disease pathogenesis. Alternatively, low B12 may be a marker of an unknown associated factor.
“Given that low B12 status is associated with neurologic and other medical morbidities and is readily treated, great care would be needed to design an ethically acceptable randomized, prospective study to evaluate the effect of B12 supplementation on Parkinson’s disease progression, given that serum measurements collected as part of a prospective study in unsupplemented patients would likely reveal some subjects with B12 deficiency,” said Dr. Christine and colleagues.
Study Raises Questions
“The course of Parkinson’s disease can be quite variable, and it is difficult for clinicians to predict what will happen to an individual person with Parkinson’s disease, but identifying prognostic factors ought to help practitioners answer their patients’ questions and potentially improve the understanding of mechanisms underlying the disease pathogenesis,” said Francisco Cardoso, MD, PhD, Professor of Neurology at the Federal University of Minas Gerais in Belo Horizonte, Brazil, in an accompanying editorial.
If vitamin B12 is related to the progression of Parkinson’s disease, then replacing it may slow patients’ decline. But the findings raise questions that need to be addressed, said Dr. Cardoso.
“First, what constitutes a low vitamin B12 level is not a simple issue. If the evaluation is limited to measurement of vitamin B12 concentration, the diagnosis of genuine deficiency is unreliable. Most experts agree that combined measurement of vitamin B12 with determination of homocysteine levels is necessary, and while Christine et al measured both levels, their statistical analysis and subsequent conclusions are exclusively based on the levels of the vitamin.
“Moreover, 34 patients in the study were classified as having ‘borderline-low’ vitamin B12 levels, and 14 had both borderline-low B12 level and high homocysteine concentration. This brings into question whether the researchers identified Parkinson’s disease patients who actually had low B12 levels.
“The design of the DATATOP trial could also introduce some bias into the findings. At the time of publication, the study was criticized for disregarding the symptomatic effect of selegiline and for a lack of objective definition of criteria for the trial’s primary end point—introduction of levodopa.
“This could have led to the termination of individuals at different stages of the disease, introducing a potential bias in the sample of patients who remained in the study for enough time to undergo subsequent determination of B12 levels.”
Furthermore, the findings of Christine et al contrast with those of previous research. The underlying mechanism of vitamin B12 in Parkinson’s disease is unclear. “Nevertheless, the results of the study by Dr. Christine and his colleagues are intriguing, and further investigations to address this hypothesis are warranted,” Dr. Cardoso concluded.
—Nicola Garrett
Suggested Reading
Cardoso F. Vitamin B12 and Parkinson’s disease: What is the relationship? Mov Disord. 2018 Mar 6 [Epub ahead of print].
Christine CW, Auinger P, Joslin A, et al. Vitamin B12 and homocysteine levels predict different outcomes in early Parkinson’s disease. Mov Disord. 2018 Mar 6 [Epub ahead of print].
Young Women With Stroke Have Higher Rates of Pregnancy Complications
When compared with the general population, young women with stroke have more pregnancy loss throughout their lives, according to research published in the April issue of Stroke. After stroke, nulliparous women more frequently experience serious pregnancy complications, compared with the general population. “We found that one out of three women experiences a serious pregnancy complication after stroke,” said Mayte E. van Alebeek, MD, of the Department of Neurology at the Donders Institute for Brain, Cognition, and Behavior, Center for Neuroscience in Nijmegen, the Netherlands, and colleagues.
“Our cohort shows high rates of miscarriages, multiple miscarriages, and extremely high rates of fetal death,” said Dr. van Alebeek. “Our study provides insight about the frequency of pregnancy complications in women who experience a stroke at young age.”
A Prospective Stroke Study
The study was a part of the Dutch Follow-Up of TIA and Stroke Patients and Unelucidated Risk Factor Evaluation (FUTURE) study. Eligible participants were women with first-ever TIA or ischemic stroke who reported that they had been pregnant at least once. Exclusion criteria were cerebral venous sinus thrombosis and retinal infarction. The investigators defined TIA as rapidly evolving focal neurologic deficit without positive phenomena such as twitches, jerks, or myoclonus, with vascular cause only, and persisting for fewer than 24 hours. They defined stroke as focal neurologic deficit persisting for more than 24 hours.
The primary outcome was the occurrence of pregnancy complications (ie, gestational hypertension; preeclampsia; hemolysis, elevated liver enzymes, low platelet count [HELLP] syndrome; preterm delivery; gestational diabetes mellitus; and miscarriage). The secondary outcome was the risk of any vascular event after stroke, stratified by the occurrence of pregnancy complications. Researchers identified the occurrence of recurrent vascular events during a telephone assessment.
Miscarriages Occurred in 35.2% of Women With Stroke
Two hundred thirteen participants completed follow-up assessment on vascular events and pregnancy complications. The mean age at the index event was 39.6 years, and mean follow-up was 12.7 years. The number of pregnancies was unknown for three women. Among the remaining 210 women, 569 pregnancies resulted in 425 live births. Pregnancy complications were reported equally in the nulliparous group (patients who experienced stroke or TIA before their first pregnancy resulting in a live-born child), the primiparous/multiparous group (patients who had had one or more pregnancies before the event), and the gravida group (patients whose event occurred during pregnancy or postpartum, defined as within six weeks after delivery).
Miscarriage occurred in 35.2% of women with stroke vs 13.5% of the Dutch population. Fetal death occurred in 6.1% of women with stroke vs 0.9% of the Dutch population.
Compared with the Dutch population, nulliparous women after stroke had a higher prevalence of hypertensive disorders in pregnancy (33.3% vs 12.2%), HELLP syndrome (9.5% vs 0.5%), and early preterm delivery before 32 weeks (9.0% vs 1.4%).
In primiparous and multiparous women after stroke, 29 recurrent vascular events occurred; none occurred during a subsequent pregnancy. A history of hypertensive disorder in pregnancy did not modify this risk.
This is the first study to address the risk of pregnancy complications in a large group of women after their stroke, said the researchers.
These findings “may imply that women with a history of stroke should be put under intensive control of a gynecologist during pregnancy to prevent serious and possibly life-threatening pregnancy complications,” Dr. van Alebeek and colleagues concluded.
—Erica Tricarico
Suggested Reading
van Alebeek ME, de Vrijer M, Arntz RM, et al. Increased risk of pregnancy complications after stroke: the FUTURE study (Follow-Up of Transient Ischemic Attack and Stroke Patients and Unelucidated Risk Factor Evaluation). Stroke. 2018;49(4):877-883.
Siponimod May Benefit Patients With Secondary Progressive MS
Among patients with secondary progressive multiple sclerosis (MS), siponimod appears to decrease the risk of disability progression, according to data published online ahead of print March 22 in Lancet. The drug’s safety profile is similar to those of other sphingosine-1-phosphate receptor modulators, and siponimod “might be a useful treatment for patients with secondary progressive MS,” according to the authors.
To date, no molecule has slowed disability progression consistently in clinical trials of patients with secondary progressive MS. Preclinical data indicate that siponimod might prevent synaptic neurodegeneration and promote remyelination. The drug reduced the number of active brain lesions and the annualized relapse rate in a phase II study.
Patients Received 2 mg/day of Oral Siponimod
Ludwig Kappos, MD, Professor of Neurology at the University of Basel in Switzerland, and colleagues conducted a double-blind, phase III trial to assess siponimod’s efficacy and safety in patients with secondary progressive MS. They enrolled 1,651 patients at 292 hospitals and MS centers in 31 countries. Eligible participants were between ages 18 and 60 and had an Expanded Disability Status Scale (EDSS) score of 3.0 to 6.5, a history of relapsing-remitting MS, EDSS progression in the previous two years, and no evidence of relapse in the three months before randomization.
The investigators randomized patients 2:1 to once-daily oral siponimod (2 mg) or matching placebo. A trained assessor performed a full neurologic examination every three months. Participants underwent MRI scans at baseline, 12 months, 24 months, 36 months, and the end of the controlled treatment phase. Patients with six-month confirmed disability progression during the double-blind phase were given the opportunity to continue double-blind treatment, switch to open-label siponimod, or stop study treatment and remain untreated or receive another therapy.
The study’s primary end point was time to three-month confirmed disability progression, which was defined as a one-point increase in EDSS for patients with a baseline score of 3.0 to 5.0, or a 0.5-point increase for patients with a baseline score of 5.5 to 6.5. The key secondary end points were time to three-month confirmed worsening of at least 20% from baseline on the Timed 25-Foot Walk (T25FW) and change from baseline in T2 lesion volume.
Treatment Did Not Affect the T25FW
Dr. Kappos and colleagues randomized 1,105 patients to siponimod and 546 patients to placebo. Baseline characteristics were similar between the two study arms. Mean age was 48, and about 60% of patients were women. Participants’ median time in the study was 21 months, and median exposure to study drug was 18 months. Approximately 82% of the siponimod group and 78% of controls completed the study.
The rate of three-month confirmed disability progression was 26% in the siponimod group and 32% in the placebo group. Siponimod thus reduced the risk of this outcome by 21%. The researchers did not observe a significant difference between groups in time to three-month confirmed worsening of at least 20% on the T25FW. The mean increase in T2 lesion volume from baseline was 183.9 mm³ in the siponimod group and 879.2 mm³ among controls.
Siponimod reduced the risk of six-month confirmed disability progression by 26%, compared with placebo. It also was associated with a lower annualized relapse rate and a longer time to confirmed first relapse, compared with placebo.
The rate of adverse events was 89% in the siponimod group and 82% among controls. The rate of serious adverse events was 18% in the siponimod group and 15% among controls. The most frequent adverse events were headache, nasopharyngitis, urinary tract infection, and falls. Serious adverse events included increased liver transaminase concentrations, basal cell carcinoma, concussion, depression, urinary tract infection, suicide attempt, gait disturbance, MS relapse, and paraparesis. The rate of discontinuation because of adverse events was 8% for siponimod and 5% for placebo.
Subgroup analyses suggested that the treatment effect of siponimod decreased with increasing age, disability, baseline disease duration, and diminishing signs of disease activity. One interpretation of this finding “is that siponimod exerts its effect on both aspects of the pathogenesis of secondary progressive disease, albeit not equally,” according to the authors.
The study was funded by Novartis Pharma, which helped design and conduct the study; collect, manage, analyze, and interpret the data; and write the study report.
Drug Might Affect Inflammatory Activity
“The reduction in the proportion of participants reaching the primary end point of only 6% and the absence of a significant difference for the key secondary clinical outcome are disappointing results and do not suggest that siponimod is an effective treatment for secondary progressive MS,” said Luanne M. Metz, MD, Professor of Medicine, and Wei-Qiao Liu, MD, a doctoral student, both at the University of Calgary, Alberta, in an accompanying editorial.
Although siponimod had a “small benefit” on the primary end point, it did not reduce the time to three-month confirmed worsening of the T25FW, said the authors. “Worsening of the T25FW by 20%, as required in this trial, is a reliable measure of change and suggests clinical significance.”
The significant differences between the siponimod group and controls on the other secondary outcomes “might all reflect an effect on the inflammatory disease activity that characterizes relapsing-remitting MS, and this trial does not, in our opinion, provide convincing evidence that we have found a treatment that exerts its clinical effect through other mechanisms,” said Drs. Metz and Liu.
“Confidence in the treatment benefit of siponimod in progressive MS will … require confirmation in a second trial,” they continued. “Trials of other novel treatments that target noninflammatory mechanisms are still needed.”
—Erik Greb
Suggested Reading
Kappos L, Bar-Or A, Cree BAC, et al. Siponimod versus placebo in secondary progressive multiple sclerosis (EXPAND): a double-blind, randomised, phase 3 study. Lancet. 2018 Mar 22 [Epub ahead of print].
Metz LM, Liu WQ. Effective treatment of progressive MS remains elusive. Lancet. 2018 Mar 22 [Epub ahead of print].
MRI Techniques Could Help Distinguish Between MS and Migraine
STOWE, VT—Some patients with migraine receive an inappropriate diagnosis of multiple sclerosis (MS). The two disorders share certain clinical and radiologic features, and misdiagnosis is a significant problem. Using MRI scanners widely available to clinicians, researchers are developing several imaging techniques that can provide an objective basis for distinguishing between MS and migraine, according to an overview provided at the Headache Cooperative of New England’s 28th Annual Stowe Headache Symposium.
The imaging techniques evaluate different aspects of MS pathology, said Andrew J. Solomon, MD, Associate Professor of Neurological Sciences at the University of Vermont College of Medicine in Burlington. The techniques have been automated to a large extent, which reduces the need for human interpretation of data. The incorporation of machine learning could further aid differential diagnosis.
Grounds for Confusion
Various similarities between migraine and MS increase the likelihood of misdiagnosis. The two disorders are chronic and entail attacks and remissions. Both are associated with changes in brain structure and white matter abnormalities that may be subclinical.
In a study of patients with migraine by Liu et al, between 25% and 35% of participants met MRI criteria for dissemination in space for MS, depending on how lesions were defined. The first report of natalizumab-associated progressive multifocal leukoencephalopathy occurred in a patient who, on autopsy, was found not to have had MS. In a 1988 study, Engell and colleagues found that of 518 consecutive patients who had died with a diagnosis of clinically definite MS, the diagnosis was incorrect for 6%.
In 2005, Carmosino and colleagues evaluated 281 patients who had been referred to an MS center and found that 67% of them did not have MS. The investigators identified 37 alternative diagnoses, of which migraine was the second most common. About 10% of participants had a final diagnosis of migraine.
In a recent survey, Dr. Solomon and colleagues asked more than 100 MS specialists whether they had seen patients who had carried a diagnosis of MS for more than one year but were found, on evaluation, not to have MS. Approximately 95% of respondents answered affirmatively. About 40% of respondents reported having seen three to five such patients in the previous year.
The current diagnostic criteria for MS rely on clinicians to interpret clinical and radiologic data and contain many caveats regarding their application, said Dr. Solomon. The criteria “were not developed to differentiate MS from other disorders,” but to predict which patients with an initial neurologic syndrome typical for MS will subsequently develop MS, he added. Physicians who are unfamiliar with the diagnostic criteria may misapply them and make an incorrect diagnosis.
The Central Vein Sign
Autopsy studies have indicated that MS lesions generally center around veins. Researchers have recently been able to visualize these veins within MS lesions using 7-T MRI. This finding, which investigators have called the central vein sign, could be a way to distinguish MS from other disorders. But 7-T MRI generally is not available to clinical neurologists. In 2012, scientists at the NIH developed a method that combines T2* imaging, which helps visualize veins, with fluid-attenuated inversion recovery (FLAIR) imaging, which visualizes MS lesions. This method, which the researchers called FLAIR*, visualizes veins within lesions (ie, the central vein sign) using 3-T MRI, which is more commonly available to clinical neurologists. Numerous studies have suggested that FLAIR* may differentiate MS from other diagnoses.
Dr. Solomon and collaborators tested this technique on a group of 10 patients with MS who had no other comorbidities for white matter disease and 10 patients with migraine and white matter abnormalities who also had no other comorbidities for white matter disease. The mean percentage of lesions with central vessels per participant was 80% in patients with MS and 34% in migraineurs. The patients with migraine had fewer juxtacortical, periventricular, and infratentorial lesions, compared with patients with MS.
Because researchers have used various definitions of the central vein sign, Dr. Solomon and colleagues published a consensus statement to improve the interpretation of the imaging findings. They recommended that neurologists disregard periventricular lesions and concentrate on subcortical and white matter lesions that are visible from two perspectives.
Another limitation of this diagnostic imaging technique is that it “requires evaluation of every single lesion to determine if a central vein was present,” said Dr. Solomon. He and his colleagues developed a simplified algorithm that required the examination of three lesions. To test this algorithm, they examined their original cohort plus 10 patients with MS and comorbidities for white matter disease (eg, migraine or hypertension) and 10 patients who had been misdiagnosed with MS (most of whom had migraine). Three blinded raters examined three lesions chosen at random from each MRI. This method had a specificity of 0.98 and a sensitivity of 0.52 for MS. The study demonstrated problems with inter-rater reliability, however.
Dr. Solomon later collaborated with researchers at the University of Pennsylvania to develop a machine learning technique that could identify the central vein sign. When they applied the technique to the expanded cohort of 40 patients, it identified the sign accurately with an area under the curve of about 0.86. The central vein sign may be a good biomarker for MS, and using this automated technique to assess 3-T MRI images appears to be clinically applicable, said Dr. Solomon.
Thalamic Volume
Thalamic atrophy is common in the early stages of relapsing-remitting MS. The thalamus also is implicated in migraine. Although studies have examined volumetric brain changes in migraine, none has examined thalamic volume specifically, said Dr. Solomon.
He and his colleagues used an automatic segmentation method to analyze thalamic volume in their cohort of 40 patients. Analysis of variance indicated that thalamic volume was significantly smaller in patients with MS, compared with patients without MS. When the researchers used a thalamic volume less than 0.0077 as a cutoff, the technique’s sensitivity and specificity for the diagnosis of MS were each 0.75.
Recent data suggest that thalamic atrophy in MS does not result from thalamic lesions, but from diffuse white matter abnormalities. Like the central vein sign, thalamic atrophy may reflect MS pathophysiology and could be incorporated into MS diagnostic criteria, said Dr. Solomon.
Cortical Lesions
Autopsy and MRI studies have shown that cortical lesions are characteristic of MS, but MRI studies have suggested that migraineurs generally do not have cortical lesions. Although neurologists can see these lesions in vivo on 7-T MRI, 3-T MRI is less sensitive, which makes cortical lesion detection challenging.
In 2017, Nakamura and colleagues found that ratio maps of T1- and T2-weighted 3-T MRI images, which are acquired in routine clinical care for MS, could identify areas of cortical demyelination. Dr. Solomon and colleagues tested whether this method could distinguish MS from migraine. They defined a z score of less than 3 as an indication of low myelin density. When they examined the cohort of 40 patients, they were able to correlate areas with z scores below the cutoff with cortical lesions that were visible on conventional imaging. The technique accurately distinguished patients with MS from patients with migraine.
None of these emerging imaging techniques is 100% accurate. In the future, however, combining several of these techniques in conjunction with tests of blood biomarkers such as microRNA could accurately distinguish between MS and other disorders with high specificity and sensitivity, Dr. Solomon concluded.
—Erik Greb
Suggested Reading
Carmosino MJ, Brousseau KM, Arciniegas DB, Corboy JR. Initial evaluations for multiple sclerosis in a university multiple sclerosis center: outcomes and role of magnetic resonance imaging in referral. Arch Neurol. 2005;62(4):585-590.
Engell T. A clinico-pathoanatomical study of multiple sclerosis diagnosis. Acta Neurol Scand. 1988;78(1):39-44.
Liu S, Kullnat J, Bourdette D, et al. Prevalence of brain magnetic resonance imaging meeting Barkhof and McDonald criteria for dissemination in space among headache patients. Mult Scler. 2013;19(8):1101-1105.
Nakamura K, Chen JT, Ontaneda D, et al. T1-/T2-weighted ratio differs in demyelinated cortex in multiple sclerosis. Ann Neurol. 2017;82(4):635-639.
Sati P, Oh J, Constable RT, et al. The central vein sign and its clinical evaluation for the diagnosis of multiple sclerosis: a consensus statement from the North American Imaging in Multiple Sclerosis Cooperative. Nat Rev Neurol. 2016;12(12):714-722.
Solomon AJ, Klein EP, Bourdette D. “Undiagnosing” multiple sclerosis: the challenge of misdiagnosis in MS. Neurology. 2012;78(24):1986-1991.
Solomon AJ, Schindler MK, Howard DB, et al. “Central vessel sign” on 3T FLAIR* MRI for the differentiation of multiple sclerosis from migraine. Ann Clin Transl Neurol. 2015;3(2):82-87.
Solomon AJ, Watts R, Dewey BE, Reich DS. MRI evaluation of thalamic volume differentiates MS from common mimics. Neurol Neuroimmunol Neuroinflamm. 2017;4(5):e387.
Conference News Roundup—European Society of Cardiology
Stroke Prevention Drugs May Reduce Dementia Risk
Patients with atrial fibrillation could reduce the risk of dementia by taking stroke prevention medications, according to recommendations published online ahead of print March 18 in EP Europace and presented at the conference. The international consensus document was also published in Heart Rhythm, the official journal of the Heart Rhythm Society (HRS), and Journal of Arrhythmia, the official journal of the Japanese Heart Rhythm Society (JHRS) and the Asia Pacific Heart Rhythm Society (APHRS).
The expert consensus statement on arrhythmias and cognitive function was developed by the European Heart Rhythm Association (EHRA), a branch of the European Society of Cardiology (ESC); HRS; APHRS; and the Latin American Heart Rhythm Society (LAHRS).
Arrhythmias, as well as some procedures undertaken to treat them, can increase the risk of cognitive decline and dementia. The international consensus document was written for doctors specializing in arrhythmias and aims to raise awareness of the risks of cognitive impairment and dementia and of methods to reduce them.
Atrial fibrillation is associated with a higher risk for cognitive impairment and dementia, even in the absence of apparent stroke, according to the document. This increased risk may arise because atrial fibrillation is linked with a more than twofold risk of silent strokes. The accumulation of silent strokes and the associated brain injuries over time may contribute to cognitive impairment.
Stroke prevention with oral anticoagulant drugs is the main priority in the management of patients with atrial fibrillation. Oral anticoagulation may reduce the risk of dementia, according to the consensus document.
Adopting a healthy lifestyle also may reduce the risk of cognitive decline in patients with atrial fibrillation. This lifestyle includes not smoking and preventing or controlling hypertension, obesity, diabetes, and sleep apnea.
The document also reviews the association between other arrhythmias and cognitive dysfunction, including postcardiac arrest, in patients with cardiac implantable devices such as implantable cardioverter defibrillators and pacemakers, and ablation procedures.
Treatment of atrial fibrillation with catheter ablation can itself lead to silent strokes and cognitive impairment. To reduce this risk, physicians should follow recommendations for performing ablation and for the management of patients before and after the procedure, according to the document.
Physicians may suspect cognitive impairment if a patient’s appearance or behavior changes (eg, if appointments are missed). Family members should be asked for collateral information. If suspicions are confirmed, the consensus document recommends tools to conduct an objective assessment of cognitive function.
The paper highlights gaps in knowledge and areas for further research. These gaps include, for instance, how to identify patients with atrial fibrillation at increased risk of cognitive impairment and dementia, the effect of rhythm control on cognitive function, and the impact of cardiac resynchronization therapy on cognitive function.
EHRA Updates Guide on NOACs
A new version of the European Heart Rhythm Association (EHRA) Practical Guide on the use of non-vitamin K antagonist oral anticoagulants (NOACs) in patients with atrial fibrillation was published online ahead of print March 19 in European Heart Journal and presented at the meeting.
“European Society of Cardiology guidelines state that NOACs should be preferred over vitamin K antagonists, such as warfarin, for stroke prevention in patients with atrial fibrillation, except those with a mechanical heart valve or rheumatic mitral valve stenosis, and their use in clinical practice is increasing,” said Jan Steffel, MD, Head of the Department of Cardiology at University Heart Center Zurich.
The guide gives advice about how to use NOACs in specific clinical situations. While companies provide a Summary of Product Characteristics for a drug, there are legal restrictions on the content, and the information is often not detailed enough for doctors.
The 2018 edition of the guide has several new chapters. One outlines how to use NOACs in particular groups of patients, including those with very low body weight, the very obese, athletes, frail patients for whom there is concern about bleeding, and patients with cognitive impairment who may forget to take their pills.
Another new chapter briefly summarizes the correct dosing of NOACs in conditions other than atrial fibrillation, such as prevention of deep venous thrombosis, treatment of venous thromboembolism, and treatment of ischemic heart disease. The dosing for each condition is different, which underscores the need for clarity.
Updated advice is given on the combined use of antiplatelets and NOACs in patients with coronary artery disease, particularly those with an acute coronary syndrome or patients scheduled for percutaneous coronary intervention with stenting.
The guide also offers scientific evidence about the use of anticoagulants around cardioversion. The authors give detailed advice about what to do in patients on long-term NOAC treatment who need cardioversion versus patients newly diagnosed with atrial fibrillation and started on a NOAC before cardioversion.
Since the previous edition of the guide was published, the first NOAC reversal agent has received market approval. The authors provide advice about using idarucizumab, which reverses the anticoagulant effect of dabigatran, when there is acute bleeding, when urgent surgery is required, or when the patient has a stroke. Guidance is also included on andexanet alfa, another reversal agent expected to receive market approval, with the caveat that the instructions on the label should be followed.
Unlike warfarin, NOACs do not require monitoring of plasma levels followed by dose adjustments. The guide describes rare scenarios in which physicians might want to know the NOAC plasma level. One scenario concerns patients undergoing major surgery in whom it is unclear, for example because of other drugs or renal dysfunction, whether the usual practice of stopping the NOAC 48 hours in advance is sufficient. The plasma level of the NOAC could be measured just before surgery to confirm that the anticoagulant effect has waned.
The chapter on drug–drug interactions has been expanded with information about anticancer and antiepileptic drugs. “While this is mostly based on potential pharmacokinetic interactions and case reports, it is the first of its kind. This is likely to be adapted and become more complete over the years as our experience increases at this new frontier,” said Dr. Steffel.
Apixaban Is Safe During Catheter Ablation
Apixaban and warfarin are equally safe during catheter ablation of atrial fibrillation, according to results of the AXAFA-AFNET 5 trial. The two drugs were associated with similar rates of stroke and bleeding, and an improvement in cognitive function after ablation was shown for the first time.
Nearly one-third of all strokes are caused by atrial fibrillation. Oral anticoagulation is the cornerstone of stroke prevention in patients with atrial fibrillation. European Society of Cardiology (ESC) guidelines recommend non-vitamin K antagonist oral anticoagulants (NOACs) over vitamin K antagonists (VKAs) such as warfarin, except in patients with a mechanical heart valve or rheumatic mitral valve stenosis. Unlike VKAs, NOACs do not require frequent monitoring and dose adjustment, and they reduce long-term rates of stroke and death, compared with VKAs.
Catheter ablation is used in patients with atrial fibrillation to restore and maintain the heart’s normal rhythm, but the procedure entails risks of stroke, bleeding, acute brain lesions, and cognitive impairment. ESC guidelines recommend that patients continue taking their prescribed NOAC or VKA during the procedure. The results of this study confirm that the NOAC apixaban is as safe as a VKA in this situation.
The AXAFA-AFNET 5 trial was the first randomized trial to examine whether continuous apixaban was a safe alternative to a VKA during catheter ablation of atrial fibrillation. In all, 633 patients with atrial fibrillation and additional stroke risk factors scheduled to undergo atrial fibrillation ablation in Europe and the United States were randomized to receive either continuous apixaban or the locally used VKA (ie, warfarin, phenprocoumon, acenocoumarol, or fluindione).
The primary outcome was a composite of all-cause death, stroke, and major bleeding up to three months after ablation. It occurred in 22 patients randomized to apixaban and 23 randomized to VKA. “The results show that apixaban is a safe alternative to warfarin during catheter ablation of atrial fibrillation in patients at risk of stroke,” said Professor Paulus Kirchhof, MD, Chair in Cardiovascular Medicine at the University of Birmingham in the United Kingdom.
The researchers assessed cognitive function at the beginning and end of the trial and found that it improved equally in both treatment groups. “This is the first randomized trial to show that cognitive function is improving after atrial fibrillation ablation,” said Professor Kirchhof. “It is possible that this is due to continuous anticoagulation, although we did not test this specifically.” An MRI substudy in 335 patients showed a similar rate of silent strokes in the apixaban (27%) and VKA (25%) groups.
Patients in the trial were four years older than participants of previous studies with the NOACs rivaroxaban and dabigatran, said Professor Kirchhof. Local investigators chose the VKA and catheter ablation procedure, which led to the use of various drugs and techniques. “These characteristics of the trial mean that the results apply to older patients and in different clinical settings,” said Professor Kirchhof.
European Society of Cardiology Publishes Guidelines on Syncope
European Society of Cardiology guidelines on syncope were presented at the conference and published online ahead of print March 19 in the European Heart Journal.
Syncope is a transient loss of consciousness caused by reduced blood flow to the brain. Approximately 50% of people experience at least one syncopal event during their lifetime. The most common type is vasovagal syncope, commonly known as fainting, which is triggered by fear, the sight of blood, or prolonged standing, for example.
The challenge for doctors is to identify the minority of patients whose syncope is caused by a potentially deadly heart problem. The guidelines recommend a new algorithm for emergency departments to stratify patients and discharge those at low risk. Patients at intermediate or high risk should receive diagnostic tests in the emergency department or an outpatient syncope clinic.
“The new pathway avoids costly hospitalizations while ensuring the patient is properly diagnosed and treated,” said Professor Michele Brignole, MD, a cardiologist at Ospedali del Tigullio in Lavagna, Italy.
Most syncope does not increase the risk of death, but it can cause injury due to falls or be dangerous in certain occupations, such as airline pilots. The guidelines provide recommendations on how to prevent syncope, which include keeping hydrated; avoiding hot, crowded environments; tensing the muscles; and lying down. The document gives advice on driving for patients with syncope, although the risk of accidents is low.
The document emphasizes the value of video recording in the hospital or at home to improve diagnosis. It recommends that friends and relatives use their smartphones to film the attack and recovery. Clinical clues, such as the duration of the loss of consciousness, whether the patient’s eyes are open or closed, and jerky movements, can distinguish between syncope, epilepsy, and other conditions.
Another diagnostic tool is the implantable loop recorder, a small device inserted underneath the skin of the chest that records the heart’s electrical signals. The guidelines recommend extending its use for diagnosis in patients with unexplained falls, suspected epilepsy, or recurrent episodes of unexplained syncope and a low risk of sudden cardiac death.
The guidelines include an addendum with practical instructions for doctors about how to perform and interpret diagnostic tests.
“The Task Force that prepared the guidelines was truly multidisciplinary,” said Professor Brignole. “A minority of cardiologists was joined by experts in emergency medicine, internal medicine and physiology, neurology and autonomic diseases, geriatric medicine, and nursing.”
Drinking Alcohol Makes the Heart Race
The more alcohol one drinks, the higher one’s heart rate gets, according to research. Binge drinking has been linked with atrial fibrillation, a phenomenon called “the holiday heart syndrome.” The connection was initially based on small studies and anecdotal evidence from the late 1970s.
The Munich Beer Related Electrocardiogram Workup (MunichBREW) study was conducted by researchers from the LMU University Hospital Munich Department of Cardiology and supported by the German Cardiovascular Research Centre and the European Commission. It was the first assessment of the acute effects of alcohol on ECG readings. The study included more than 3,000 people attending the 2015 Munich Oktoberfest. ECG readings were taken, and breath alcohol concentrations were measured. Age, sex, heart disease, heart medications, and smoking status were recorded. Participants were, on average, 35 years old, and 30% were women.
The average breath alcohol concentration was 0.85 g/kg. Increasing breath alcohol concentration was significantly associated with sinus tachycardia of more than 100 bpm in 25.9% of the cohort.
The current analysis of the MunichBREW study looked in more detail at the quantitative ECG measurements in 3,012 participants. The researchers investigated the association between breath alcohol concentration and the ECG parameters of excitation (ie, heart rate), conduction (ie, PR interval and QRS complex), and repolarization (ie, QT interval).
Increased heart rate was associated with higher breath alcohol concentration, confirming the initial results of the MunichBREW study. The association was linear, with no threshold. Alcohol consumption had no effect on the other three parameters.
“The more alcohol you drink, the higher your heart rate gets,” said Stefan Brunner, MD, a cardiologist at the University Hospital Munich, one of the lead authors.
The researchers are currently investigating whether the increase in heart rate with alcohol consumption could lead to heart rhythm disorders in the longer term.
European Society of Cardiology Publishes Guidelines on Syncope
European Society of Cardiology guidelines on syncope were presented at the conference and published online ahead of print March 19 in the European Heart Journal.
Syncope is a transient loss of consciousness caused by reduced blood flow to the brain. Approximately 50% of people have one syncopal event during their lifetime. The most common type is vasovagal syncope, commonly known as fainting, triggered by fear, seeing blood, or prolonged standing, for example.
The challenge for doctors is to identify the minority of patients whose syncope is caused by a potentially deadly heart problem. The guidelines recommend a new algorithm for emergency departments to stratify patients and discharge those at low risk. Patients at intermediate or high risk should receive diagnostic tests in the emergency department or an outpatient syncope clinic.
“The new pathway avoids costly hospitalizations while ensuring the patient is properly diagnosed and treated,” said Professor Michele Brignole, MD, a cardiologist at Ospedali del Tigullio in Lavagna, Italy.
Most syncope does not increase the risk of death, but it can cause injury due to falls or be dangerous in certain occupations, such as airline pilots. The guidelines provide recommendations on how to prevent syncope, which include keeping hydrated; avoiding hot, crowded environments; tensing the muscles; and lying down. The document gives advice on driving for patients with syncope, although the risk of accidents is low.
The document emphasizes the value of video recording in the hospital or at home to improve diagnosis. It recommends that friends and relatives use their smartphones to film the attack and recovery. Clinical clues, such as the duration of the loss of consciousness, whether the patient’s eyes are open or closed, and jerky movements, can distinguish between syncope, epilepsy, and other conditions.
Another diagnostic tool is the implantable loop recorder, a small device inserted underneath the skin of the chest that records the heart’s electrical signals. The guidelines recommend extending its use for diagnosis in patients with unexplained falls, suspected epilepsy, or recurrent episodes of unexplained syncope and a low risk of sudden cardiac death.
The guidelines include an addendum with practical instructions for doctors about how to perform and interpret diagnostic tests.
“The Task Force that prepared the guidelines was truly multidisciplinary,” said Professor Brignole. “A minority of cardiologists was joined by experts in emergency medicine, internal medicine and physiology, neurology and autonomic diseases, geriatric medicine, and nursing.”
Drinking Alcohol Makes the Heart Race
The more alcohol one drinks, the higher one’s heart rate gets, according to research. Binge drinking has been linked with atrial fibrillation, a phenomenon called “the holiday heart syndrome.” The connection was initially based on small studies and anecdotal evidence from the late 1970s.
The Munich Beer Related Electrocardiogram Workup (Munich BREW) study was conducted by researchers from the LMU University Hospital Munich Department of Cardiology and supported by the German Cardiovascular Research Centre and the European Commission. It was the first assessment of the acute effects of alcohol on ECG readings. The study included more than 3,000 people attending the 2015 Munich Oktoberfest. ECG readings were taken, and breath alcohol concentrations were measured. Age, sex, heart disease, heart medications, and smoking status were recorded. Participants were, on average, 35 years old, and 30% were women.
The average breath alcohol concentration was 0.85 g/kg. Increasing breath alcohol concentration was significantly associated with sinus tachycardia of more than 100 bpm in 25.9% of the cohort.
The current analysis of the Munich BREW study looked in more detail at the quantitative ECG measurements in 3,012 participants. The researchers investigated the association between blood alcohol concentration and the ECG parameters of excitation (ie, heart rate), conduction (ie, PR interval and QRS complex), and repolarization (ie, QT interval).
Increased heart rate was associated with higher breath alcohol concentration, confirming the initial results of the Munich BREW study. The association was linear, with no threshold. Alcohol consumption had no effect on the other three parameters.
“The more alcohol you drink, the higher your heart rate gets,” said Stefan Brunner, MD, a cardiologist at the University Hospital Munich, one of the lead authors.
The researchers are currently investigating whether the increase in heart rate with alcohol consumption could lead to heart rhythm disorders in the longer term.
Headache Remains a Significant Public Health Problem
Severe headache and migraine remain significant public health problems, and their prevalence has been stable for years, according to a review published online ahead of print March 12 in Headache. Results confirm that “migraine disproportionately affects women and several other historically disadvantaged segments of the population,” according to the authors. “These inequities could be exacerbated if new high-cost treatments are inaccessible to those who need them most.”
Rebecca Burch, MD, Instructor in Neurology at Harvard Medical School in Boston, and colleagues reviewed population-based US government surveys to obtain updated estimates of the prevalence of migraine and severe headache in adults. The authors examined the most recent data from the National Health Interview Survey, the National Hospital Ambulatory Medical Care Survey, and the National Ambulatory Medical Care Survey.
The most recent National Health Interview Survey data were from 2015. They indicated that the overall prevalence of migraine or severe headache was 15.3%. The prevalence was 20.7% in women and 9.7% in men. The age group with the highest prevalence of migraine (17.9%) included patients between ages 18 and 44. Prevalence was 15.9% in people between ages 45 and 64.
The prevalence of migraine or severe headache also varied by race. The highest prevalence (20.3%) was among native Hawaiians and other Pacific islanders. Prevalence was 18.4% among American Indians or Alaska natives, 16.2% among blacks or African Americans, 15.4% among whites, and 11.3% among Asians.
Data indicated that prevalence varied with income and insurance status. People living below the poverty line had a prevalence of 21.7%, and those with an annual family income of less than $35,000 had a prevalence of 19.9%. For people younger than 65, prevalence was higher in people insured by Medicaid (26.0%), compared with people with private insurance (15.1%) or no insurance (17.1%).
The most recent data for the National Hospital Ambulatory Medical Care Survey were from 2014. In that year, headache or pain in the head prompted approximately four million emergency department visits. Women of childbearing age made more than half of emergency department visits for headache.
Headache or pain in the head accounted for 3.0% of all emergency department visits and was the fifth leading cause of visits to the emergency department, as reported by patients. Headache was the 12th most common diagnosis among emergency department physicians (1.8% of all visits). It was the sixth most common diagnosis for women aged 15 to 64 (1.7%), and migraine was the 15th most common for this population (0.8%). Headache was the 19th most common diagnosis among men between ages 15 and 64 (0.5%).
No new data about headache or head pain from the National Ambulatory Medical Care Survey were available. Headache has not been among the top 20 reasons for outpatient visits since the 2009–2010 survey.
“It is important to understand the distribution of headache in specific segments of the population,” said Dr. Burch and colleagues. “This can guide efforts to ensure that treatments are accessible to those with the highest level of need.”
—Erik Greb
Suggested Reading
Burch R, Rizzoli P, Loder E. The prevalence and impact of migraine and severe headache in the United States: figures and trends from government health studies. Headache. 2018 Mar 12 [Epub ahead of print].
Is Chronification the Natural History of Migraine?
OJAI, CA—Diagnosis, while critically important in the management of migraine, “is just half the picture,” said Robert Cowan, MD, Higgins Professor of Neurology and Neurosciences and Director of the Division of Headache and Facial Pain at Stanford University in California. Changes over time, attack frequency, chronification, comorbidity, and disability complicate the management of this disorder. At the 11th Annual Headache Cooperative of the Pacific’s Winter Conference, Dr. Cowan reviewed the clinical evidence suggesting that episodic and chronic migraine are two distinct entities, stressed the importance of classification and staging in the diagnosis and treatment of migraine, and elucidated the signs and symptoms of migraine chronification.
Challenging Assumptions
For years, the prevailing perception among clinicians has been that patients with migraine progress from episodic migraine into a state of chronic headache at an annual conversion rate of about 3%. A wealth of data supports this concept. But recent studies have challenged the idea that episodic migraine and chronic migraine are the same entity differentiated only by attack frequency. Research by Schwedt and colleagues, for example, seemed to differentiate between episodic and chronic migraine purely on the basis of anatomy. Additionally, evidence suggests that there are differences between episodic and chronic migraine in regard to connectivity based on functional MRI mapping of pain processing networks. Chronic and episodic migraine seem to affect the brain in different ways, Dr. Cowan said. Structural, functional, and pharmacologic changes in the brain differentiate chronic migraine from episodic migraine.
Classification and Staging
There is evidence that years of headache may make a difference. “Over time, headache after headache can remodel certain areas of the brain, in some areas thickening cortex, and in others, thinning it. This is likely the result of recurrent migraine attacks over time,” Dr. Cowan said. “The patient who has had chronic headache for 30 years is likely to process sensory input differently from the patient who has gone from eight to 14 headaches per month in the last three months.”
The Physician’s Role in Chronification
Physicians themselves can play a role in migraine chronification. Misdiagnosis and underdiagnosis increase the risk of migraine chronification. “When a patient is not doing well, I think it is important to go back and revisit the diagnosis. Make sure that you have the right diagnosis and that the diagnosis has not changed. Are there additional diagnoses?” Other pitfalls that may lead to chronification include failure to recognize treatable comorbidities, inadequate or inappropriate medication use, and signs of chronification such as central sensitization. Weak or recall-biased intervisit data collection also may be a problem.
Risk Factors for Chronification
“It is useful to differentiate between modifiable and nonmodifiable factors,” Dr. Cowan said. Modifiable risk factors include medication overuse, ineffective acute treatment, obesity, depression, stressful life events, and low educational level. Nonmodifiable risk factors include age and female sex. “Another approach is to look at state-specific factors versus process-related factors,” Dr. Cowan said. State-specific factors include obesity, history of abuse, comorbid pain, and head injury. Process-related factors include years of headache, increasing frequency, catastrophizing, and medication overuse.
Chronification Patterns
Clinical clues that chronification may be occurring include increasing headache frequency, severity, or duration, emergence of a second headache type, and a change in pattern of symptoms unrelated to pain. Additionally, allodynia may be a marker of chronification, and central sensitization plays a large role in chronification. “These are things we can assess clinically,” Dr. Cowan said. “We should be thinking about all these things and asking our patients about them as we follow them from visit to visit.” As headache specialists, “our job is not done once we have a diagnosis and go through the Rolodex of treatments for that diagnosis.”
—Glenn S. Williams
Suggested Reading
Bigal ME, Lipton RB. Clinical course in migraine: conceptualizing migraine transformation. Neurology. 2008;71(11):848-855.
Mainero C, Boshyan J, Hadjikhani N. Altered functional magnetic resonance imaging resting-state connectivity in periaqueductal gray networks in migraine. Ann Neurol. 2011;70(5):838-845.
May A, Schulte LH. Chronic migraine: risk factors, mechanisms and treatment. Nat Rev Neurol. 2016;12(8):455-464.
Schwedt TJ, Chong CD, Wu T, et al. Accurate classification of chronic migraine via brain magnetic resonance imaging. Headache. 2015;55(6):762-777.