Palliative care for patients with dementia: When to refer?
Palliative care for people with dementia is increasingly recognized as a way to improve quality of life and provide relief from the myriad physical and psychological symptoms of advancing neurodegenerative disease. But unlike in cancer, where pathways for timely referral to specialist palliative care are relatively well established, there is little consensus on when patients with dementia should be referred.
A new literature review has found that these referrals vary widely among patients with dementia – with many occurring very late in the disease process – and do not reflect consistent criteria based on patient needs.
For their research, published March 2 in the Journal of the American Geriatrics Society, Li Mo, MD, of the University of Texas MD Anderson Cancer Center in Houston, and colleagues looked at nearly 60 studies dating back to the early 1990s that contained information on referrals to palliative care for patients with dementia. While a palliative care approach can be provided by nonspecialists, all the included studies dealt at least in part with specialist care.
Standardized criteria are lacking
The investigators found advanced or late-stage dementia to be the most common reason cited for referral, with three quarters of the studies recommending palliative care for late-stage or advanced dementia, generally without qualifying what symptoms or needs were present. Patients received palliative care across a range of settings, including nursing homes, hospitals, and their own homes, though many articles did not include information on where patients received care.
A fifth of the articles suggested that medical complications of dementia, including falls, pneumonia, and ulcers, should trigger referral to palliative care, while another fifth cited poor prognosis, defined variously as a life expectancy of between 6 months and 2 years. Poor nutritional status was identified in 10% of studies as meriting referral.
Only 20% of the studies identified patient needs – evidence of psychological distress or functional decline, for example – as criteria for referral, despite these being ubiquitous in dementia. The authors said they were surprised by this finding, which could possibly be explained, they wrote, by “the interest among geriatrician, neurologist, and primary care teams to provide good symptom management,” reflecting a de facto palliative care approach. “There is also significant stigma associated with a specialist palliative care referral,” the authors noted.
Curiously, the researchers noted, in more than a quarter of the studies a new diagnosis of dementia itself triggered referral, a finding that may reflect delayed diagnoses.
The findings revealed “heterogeneity in the literature in reasons for involving specialist palliative care, which may partly explain the variation in patterns of palliative care referral,” Dr. Mo and colleagues wrote, stressing that more standardized criteria are urgently needed to bring dementia in line with cancer in terms of providing timely palliative care.
Patients with advancing dementia become increasingly unable to self-report symptoms, meaning that more attention to patient complaints earlier in the disease course, and greater sensitivity to patient distress, are required. By routinely screening symptoms, clinicians could use specific cutoffs “as triggers to initiate automatic timely palliative care referral,” the authors concluded, noting that more research is needed before such cutoffs, whether based on symptom intensity or other measures, can be established.
Dr. Mo and colleagues acknowledged limitations of their review: a third of the included articles were based on expert consensus, and others did not distinguish clearly between primary and specialist palliative care.
A starting point for further discussion
Asked to comment on the findings, Elizabeth Sampson, MD, a palliative care researcher at University College London, praised Dr. Mo and colleagues’ study as “starting to pull together the strands” of a systematic approach to referrals and access to palliative care in dementia.
“Sometimes you need a paper like this to kick off the discussion to say look, this is where we are,” Dr. Sampson said, noting that the focus on need-based criteria dovetailed with a “general feeling in the field that we need to really think about needs, and what palliative care needs might be. What the threshold for referral should be we don’t know yet. Should it be three unmet needs? Or five? We’re still a long way from knowing.”
Dr. Sampson’s group is leading a UK government-funded research effort that aims to develop cost-effective palliative care interventions in dementia, in part through a tool that uses caregiver reports to assess symptom burden and patient needs. The research program “is founded on a needs-based approach, which aims to look at people’s individual needs and responding to them in a proactive way,” she said.
One of the obstacles to timely palliative care in dementia, Dr. Sampson said, is weighing resource allocation against what can be wildly varying prognoses. “Hospices understand when someone has terminal cancer and [is] likely to die within a few weeks, but it’s not unheard of for someone in very advanced stages of dementia to live another year,” she said. “There are concerns that a rapid increase in people with dementia being moved to palliative care could overwhelm already limited hospice capacity. We would argue that the best approach is to get palliative care out to where people with dementia live, which is usually the care home.”
Dr. Mo and colleagues’ study received funding from the National Institutes of Health, and its authors disclosed no financial conflicts of interest. Dr. Sampson’s work is supported by the UK’s Economic and Social Research Council and National Institute for Health Research. She disclosed no conflicts of interest.
FROM THE JOURNAL OF THE AMERICAN GERIATRICS SOCIETY
Risdiplam study shows promise for spinal muscular atrophy
Infants with type 1 spinal muscular atrophy (SMA) treated with oral risdiplam showed increased expression of survival of motor neuron (SMN) protein – a boost that has been linked to improvements in survival and motor function – in the 2-part, phase 2-3, open-label FIREFISH study, and improvements were also observed in exploratory efficacy outcomes.
“No surviving infant was receiving permanent ventilation at month 12, and 7 of the 21 infants were able to sit without support, which is not expected in patients with type 1 spinal muscular atrophy, according to historical experience,” reported the FIREFISH Working Group led by Giovanni Baranello, MD, PhD, from the Dubowitz Neuromuscular Centre, National Institute for Health Research Great Ormond Street Hospital Biomedical Research Centre, Great Ormond Street Institute of Child Health University College London, and Great Ormond Street Hospital Trust, London.
However, “it cannot be stated with confidence that there was clinical benefit of the agent because the exploratory clinical endpoints were analyzed post hoc and can only be qualitatively compared with historical cohorts,” they added.
The findings were published online Feb. 24 in the New England Journal of Medicine.
A phase 2-3 open-label study
The study enrolled 21 infants with type 1 SMA between the ages of 1 and 7 months. The majority (n = 17) were treated for 1 year with high-dose risdiplam, reaching 0.2 mg/kg of body weight per day by the twelfth month. Four infants in a low-dose cohort were receiving 0.08 mg/kg per day by the twelfth month. The medication was administered orally once daily in infants who were able to swallow, or by feeding tube for those who could not.
The primary outcomes of this first part of the study were safety, pharmacokinetics, pharmacodynamics (including the blood SMN protein concentration), and selection of the risdiplam dose for part 2 of the study. Exploratory outcomes included event-free survival, defined as being alive without tracheostomy or the use of permanent ventilation for 16 or more hours per day, and the ability to sit without support for at least 5 seconds.
In terms of safety, the study recorded 24 serious adverse events. “The most common serious adverse events were infections of the respiratory tract, and four infants died of respiratory complications; these findings are consistent with the neuromuscular respiratory failure that characterizes spinal muscular atrophy,” the authors reported. “The risdiplam-associated retinal toxic effects that had been previously observed in monkeys were not observed in the current study,” they added.
Regarding SMN protein levels, a median level of 2.1 times the baseline level was observed within 4 weeks after the initiation of treatment in the high-dose cohort, they reported. By 12 months, these median values had increased to 3.0 times and 1.9 times the baseline values in the low-dose and high-dose cohorts, respectively.
Looking at exploratory efficacy outcomes, 90% of infants survived without ventilatory support, and seven infants in the high-dose cohort were able to sit without support for at least 5 seconds. The higher dose of risdiplam (0.2 mg/kg per day) was selected for part 2 of the study.
The first oral treatment option
Risdiplam is the third SMA treatment approved by the Food and Drug Administration, “and has the potential to expand access to treatment for people with SMA,” commented Mary Schroth, MD, chief medical officer of Cure SMA, who was not involved in the research. She added that the exploratory outcomes of the FIREFISH study represent “a significant milestone for symptomatic infants with SMA type 1.”
While the other two approved SMA therapies – nusinersen and onasemnogene abeparvovec – have also led to improvements in survival and motor function, they are administered intrathecally and intravenously, respectively, whereas risdiplam is an oral therapy.
Dr. Schroth said there are currently no studies comparing the different SMA treatments. “Cure SMA is actively collecting real-world experience with risdiplam and other SMA treatments through multiple pathways,” she said. “Every individual and family, in collaboration with their health care provider, should discuss SMA treatments and make the decision that is best for them.”
Writing in Neuroscience Insights a few months after risdiplam’s FDA approval last summer, Ravindra N. Singh, MD, of the department of biomedical sciences, Iowa State University, Ames, wrote that, as an orally deliverable small molecule, risdiplam “is a major advancement for the treatment of SMA.”
Now, the FIREFISH study is “welcome news,” he said in an interview. “The results look promising so far,” he added. “I am cautiously optimistic that risdiplam would prove to be a viable alternative to the currently available invasive approaches. However, long-term studies (with appropriate age and sex-matched cohorts) would be needed to fully rule out the potential side effects of the repeated administrations.”
The therapy “is particularly great news for a group of SMA patients that might have tolerability and/or immune response concerns when it comes to nusinersen and gene therapy,” he noted in his article, adding that the ability to store and ship the drug at ambient temperatures, as well as its comparatively low cost, are added benefits.
The study was supported by F. Hoffmann–La Roche. Dr. Baranello disclosed that he serves as a consultant for AveXis, F. Hoffmann-La Roche, and Sarepta Therapeutics, as well as PTC Therapeutics, from whom he also receives speaker honoraria. Dr. Schroth disclosed no personal conflicts and is an employee of Cure SMA. Cure SMA works to develop strategic relationships with corporate partners with the goal of working together to lead the way to a world without SMA. In advancement of that mission, Cure SMA has received funding from multiple corporate sources including Aetna, Biogen, Blue Cross Blue Shield, Genentech, Kaiser Permanente, Novartis Gene Therapies, Scholar Rock, and United HealthCare. Cure SMA has no financial stake in any treatment and does not advocate for one treatment over another. Dr. Singh disclosed that Spinraza (Nusinersen), the first FDA-approved SMA drug, is based on the target (US patent # 7,838,657) that was discovered in his former laboratory at UMASS Medical School, Worcester, Mass.
FIREFISH study.
A boost in SMN expression has been linked to improvements in survival and motor function, which was also observed in exploratory efficacy outcomes in the 2-part, phase 2-3, open-label study.
“No surviving infant was receiving permanent ventilation at month 12, and 7 of the 21 infants were able to sit without support, which is not expected in patients with type 1 spinal muscular atrophy, according to historical experience,” reported the FIREFISH Working Group led by Giovanni Baranello, MD, PhD, from the Dubowitz Neuromuscular Centre, National Institute for Health Research Great Ormond Street Hospital Biomedical Research Centre, Great Ormond Street Institute of Child Health University College London, and Great Ormond Street Hospital Trust, London.
However, “it cannot be stated with confidence that there was clinical benefit of the agent because the exploratory clinical endpoints were analyzed post hoc and can only be qualitatively compared with historical cohorts,” they added.
The findings were published online Feb. 24 in the New England Journal of Medicine.
A phase 2-3 open-label study
The study enrolled 21 infants with type 1 SMA, between the ages of 1 and 7 months. The majority (n = 17) were treated for 1 year with high-dose risdiplam, reaching 0.2 mg/kg of body weight per day by the twelfth month. Four infants in a low-dose cohort were treated with 0.08 mg/kg by the twelfth month. The medication was administered once daily orally in infants who were able to swallow, or by feeding tube for those who could not.
The primary outcomes of this first part of the study were safety, pharmacokinetics, pharmacodynamics (including the blood SMN protein concentration), and selection of the risdiplam dose for part 2 of the study. Exploratory outcomes included event-free survival, defined as being alive without tracheostomy or the use of permanent ventilation for 16 or more hours per day, and the ability to sit without support for at least 5 seconds.
In terms of safety, the study recorded 24 serious adverse events. “The most common serious adverse events were infections of the respiratory tract, and four infants died of respiratory complications; these findings are consistent with the neuromuscular respiratory failure that characterizes spinal muscular atrophy,” the authors reported. “The risdiplam-associated retinal toxic effects that had been previously observed in monkeys were not observed in the current study,” they added.
Regarding SMN protein levels, a median level of 2.1 times the baseline level was observed within 4 weeks after the initiation of treatment in the high-dose cohort, they reported. By 12 months, these median values had increased to 3.0 times and 1.9 times the baseline values in the low-dose and high-dose cohorts, respectively.
Looking at exploratory efficacy outcomes, 90% of infants survived without ventilatory support, and seven infants in the high-dose cohort were able to sit without support for at least 5 seconds. The higher dose of risdiplam (0.2 mg/kg per day) was selected for part 2 of the study.
The first oral treatment option
Risdiplam is the third SMA treatment approved by the Food and Drug Administration, “and has the potential to expand access to treatment for people with SMA,” commented Mary Schroth, MD, chief medical officer of Cure SMA, who was not involved in the research. She added that the exploratory outcomes of the FIREFISH study represent “a significant milestone for symptomatic infants with SMA type 1.”
While the other two approved SMA therapies – nusinersen and onasemnogene abeparvovec – have led to improvements in survival and motor function, they are administered either intrathecally or intravenously respectively, while risdiplam is an oral therapy.
Dr. Schroth says there are currently no studies comparing the different SMA treatments. “Cure SMA is actively collecting real-world experience with risdiplam and other SMA treatments through multiple pathways,” she said. “Every individual and family, in collaboration with their health care provider, should discuss SMA treatments and make the decision that is best for them.”
Writing in Neuroscience Insights, a few months after risdiplam’s FDA approval last summer, Ravindra N. Singh MD, from the department of biomedical sciences, Iowa State University, Ames, wrote that, as an orally deliverable small molecule, risdiplam “is a major advancement for the treatment of SMA.”
Now, the FIREFISH study is “welcome news,” he said in an interview. “The results look promising so far,” he added. “I am cautiously optimistic that risdiplam would prove to be a viable alternative to the currently available invasive approaches. However, long-term studies (with appropriate age and sex-matched cohorts) would be needed to fully rule out the potential side effects of the repeated administrations.”
The therapy “is particularly great news for a group of SMA patients that might have tolerability and/or immune response concerns when it comes to nusinersen and gene therapy,” he noted in his article, adding that the ability to store and ship the drug at ambient temperatures, as well as its comparatively low cost are added benefits.
The study was supported by F. Hoffmann–La Roche. Dr. Baranello disclosed that he serves as a consultant for AveXis, F. Hoffmann-La Roche, and Sarepta Therapeutics, as well as PTC Therapeutics, from whom he also receives speaker honoraria. Dr. Schroth disclosed no personal conflicts and is an employee of Cure SMA. Cure SMA works to develop strategic relationships with corporate partners with the goal of working together to lead the way to a world without SMA. In advancement of that mission, Cure SMA has received funding from multiple corporate sources including Aetna, Biogen, Blue Cross Blue Shield, Genentech, Kaiser Permanente, Novartis Gene Therapies, Scholar Rock, and United HealthCare. Cure SMA has no financial stake in any treatment and does not advocate for one treatment over another. Dr. Singh disclosed that Spinraza (Nusinersen), the first FDA-approved SMA drug, is based on the target (US patent # 7,838,657) that was discovered in his former laboratory at UMASS Medical School, Worcester, Mass.
FIREFISH study.
A boost in SMN expression has been linked to improvements in survival and motor function, which was also observed in exploratory efficacy outcomes in the 2-part, phase 2-3, open-label study.
“No surviving infant was receiving permanent ventilation at month 12, and 7 of the 21 infants were able to sit without support, which is not expected in patients with type 1 spinal muscular atrophy, according to historical experience,” reported the FIREFISH Working Group led by Giovanni Baranello, MD, PhD, from the Dubowitz Neuromuscular Centre, National Institute for Health Research Great Ormond Street Hospital Biomedical Research Centre, Great Ormond Street Institute of Child Health University College London, and Great Ormond Street Hospital Trust, London.
However, “it cannot be stated with confidence that there was clinical benefit of the agent because the exploratory clinical endpoints were analyzed post hoc and can only be qualitatively compared with historical cohorts,” they added.
The findings were published online Feb. 24 in the New England Journal of Medicine.
A phase 2-3 open-label study
The study enrolled 21 infants with type 1 SMA, between the ages of 1 and 7 months. The majority (n = 17) were treated for 1 year with high-dose risdiplam, reaching 0.2 mg/kg of body weight per day by the twelfth month. Four infants in a low-dose cohort were treated with 0.08 mg/kg by the twelfth month. The medication was administered once daily orally in infants who were able to swallow, or by feeding tube for those who could not.
The primary outcomes of this first part of the study were safety, pharmacokinetics, pharmacodynamics (including the blood SMN protein concentration), and selection of the risdiplam dose for part 2 of the study. Exploratory outcomes included event-free survival, defined as being alive without tracheostomy or the use of permanent ventilation for 16 or more hours per day, and the ability to sit without support for at least 5 seconds.
In terms of safety, the study recorded 24 serious adverse events. “The most common serious adverse events were infections of the respiratory tract, and four infants died of respiratory complications; these findings are consistent with the neuromuscular respiratory failure that characterizes spinal muscular atrophy,” the authors reported. “The risdiplam-associated retinal toxic effects that had been previously observed in monkeys were not observed in the current study,” they added.
Regarding SMN protein levels, a median level of 2.1 times the baseline level was observed within 4 weeks after the initiation of treatment in the high-dose cohort, they reported. By 12 months, these median values had increased to 3.0 times and 1.9 times the baseline values in the low-dose and high-dose cohorts, respectively.
Looking at exploratory efficacy outcomes, 90% of infants survived without ventilatory support, and seven infants in the high-dose cohort were able to sit without support for at least 5 seconds. The higher dose of risdiplam (0.2 mg/kg per day) was selected for part 2 of the study.
The first oral treatment option
Risdiplam is the third SMA treatment approved by the Food and Drug Administration, “and has the potential to expand access to treatment for people with SMA,” commented Mary Schroth, MD, chief medical officer of Cure SMA, who was not involved in the research. She added that the exploratory outcomes of the FIREFISH study represent “a significant milestone for symptomatic infants with SMA type 1.”
The other two approved SMA therapies – nusinersen and onasemnogene abeparvovec – have also led to improvements in survival and motor function, but they are administered intrathecally and intravenously, respectively, whereas risdiplam is an oral therapy.
Dr. Schroth said there are currently no studies comparing the different SMA treatments. “Cure SMA is actively collecting real-world experience with risdiplam and other SMA treatments through multiple pathways,” she said. “Every individual and family, in collaboration with their health care provider, should discuss SMA treatments and make the decision that is best for them.”
Writing in Neuroscience Insights a few months after risdiplam’s FDA approval last summer, Ravindra N. Singh, MD, of the department of biomedical sciences, Iowa State University, Ames, wrote that, as an orally deliverable small molecule, risdiplam “is a major advancement for the treatment of SMA.”
Now, the FIREFISH study is “welcome news,” he said in an interview. “The results look promising so far,” he added. “I am cautiously optimistic that risdiplam would prove to be a viable alternative to the currently available invasive approaches. However, long-term studies (with appropriate age and sex-matched cohorts) would be needed to fully rule out the potential side effects of the repeated administrations.”
The therapy “is particularly great news for a group of SMA patients that might have tolerability and/or immune response concerns when it comes to nusinersen and gene therapy,” he noted in his article, adding that the ability to store and ship the drug at ambient temperature, as well as its comparatively low cost, are added benefits.
The study was supported by F. Hoffmann–La Roche. Dr. Baranello disclosed that he serves as a consultant for AveXis, F. Hoffmann-La Roche, and Sarepta Therapeutics, as well as PTC Therapeutics, from whom he also receives speaker honoraria. Dr. Schroth disclosed no personal conflicts and is an employee of Cure SMA. Cure SMA works to develop strategic relationships with corporate partners with the goal of working together to lead the way to a world without SMA. In advancement of that mission, Cure SMA has received funding from multiple corporate sources including Aetna, Biogen, Blue Cross Blue Shield, Genentech, Kaiser Permanente, Novartis Gene Therapies, Scholar Rock, and United HealthCare. Cure SMA has no financial stake in any treatment and does not advocate for one treatment over another. Dr. Singh disclosed that Spinraza (Nusinersen), the first FDA-approved SMA drug, is based on the target (US patent # 7,838,657) that was discovered in his former laboratory at UMASS Medical School, Worcester, Mass.
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Adherence and discontinuation limit triptan outcomes
“Few people continue on triptans either due to lack of efficacy or too many adverse events,” said Alan Rapoport, MD, clinical professor of neurology at the University of California, Los Angeles. “Some people overuse triptans when they are available and work well, but the patients are not properly informed, and do not listen.”
Migraine is among the most common neurologic disorders and ranks second among diseases contributing to years lived with disability. An estimated 11.7% of people have migraine episodes annually, and the disorder remains highly prevalent throughout patients’ lives.
Triptans were recognized as highly effective for acute migraine management when they were introduced in the early 1990s, and they remain the first-line treatment for acute migraine not adequately controlled by ordinary analgesics and NSAIDs. Side-effect profiles vary within the class, but frequent users risk medication overuse headache, a condition marked by migraines of increasing frequency and intensity.
25 years of triptan use
Study investigators conducted a nationwide, register-based cohort study using data from 7,435,758 Danish residents who accessed the public health care system between Jan. 1, 1994, and Oct. 31, 2019 – a span of 139.0 million person-years during which the residents were alive and living in Denmark. The findings were published online Feb. 14, 2021, in Cephalalgia.
Researchers evaluated and summarized purchases of all triptans, in all dosage forms, sold in Denmark during that time frame: sumatriptan, naratriptan, zolmitriptan, rizatriptan, almotriptan, eletriptan, and frovatriptan. Based on these data, 381,695 patients purchased a triptan at least once. Triptan users were more likely to be female (75.7%) than male (24.3%).
Dr. Rapoport, who was not involved in the study, believes the sex differences in triptan use extrapolate to the U.S. migraine population as well. “Three times more women have migraines than men and buy triptans in that ratio,” he said.
Any patient who purchased at least one triptan at any point during the study was classified as a triptan user. Triptan overuse was defined, per the International Classification of Headache Disorders, as using a triptan on 10 or more days per month for 3 consecutive months. It is important to note that triptans are prescribed for only two indications – migraine and cluster headache – and cluster headache is extremely rare.
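The overuse criterion (10 or more triptan use-days per month for 3 consecutive months) amounts to a sliding-window check; a simplified sketch with hypothetical monthly counts:

```python
def triptan_overuse(monthly_use_days: list) -> bool:
    """True if any 3 consecutive months each have >= 10 triptan use-days
    (a simplified reading of the ICHD criterion)."""
    return any(all(days >= 10 for days in monthly_use_days[i:i + 3])
               for i in range(len(monthly_use_days) - 2))

print(triptan_overuse([12, 11, 10, 2]))   # True: months 1-3 all >= 10
print(triptan_overuse([12, 2, 11, 10]))   # False: no 3-month run
```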
The study’s investigators summarized data collected throughout Denmark for more than a quarter of a century. The findings show an increase in triptan use from 345 to 945 defined daily doses per 1,000 inhabitants per year, along with an increase in the prevalence of triptan use from 5.17 to 14.57 per 1,000 inhabitants. In addition, 12.3% of Danish residents with migraine bought a triptan between 2014 and 2019 – a figure Dr. Rapoport noted falls in line with trends in other Western countries, which range between 12% and 13%.
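The per-1,000-inhabitant prevalence figures above are straightforward rate arithmetic; a sketch using hypothetical counts (the study’s raw numerators are not given here):

```python
def rate_per_1000(cases: int, population: int) -> float:
    """Rate per 1,000 inhabitants."""
    return 1000 * cases / population

# Hypothetical: 72,850 triptan users in a population of 5,000,000
print(rate_per_1000(72_850, 5_000_000))  # 14.57
```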
Nearly half of first-time triptan buyers (43%) did not purchase another triptan for 5 years. In conflict with established guidelines, 90% of patients who discontinued triptan-based treatment had tried only one triptan type.
One factor easing data collection is that Denmark provides free health care, coupled with sizable medication reimbursements. This accessible system removes barriers of price and availability, yielding data that more accurately reflect patients’ treatment needs and satisfaction.
“In a cohort with access to free clinical consultations and low medication costs, we observed low rates of triptan adherence, likely due to disappointing efficacy and/or unpleasant side effects rather than economic considerations. Triptan success continues to be hindered by poor implementation of clinical guidelines and high rates of treatment discontinuance,” the researchers concluded.
“The most surprising thing about this study is it is exactly what I would have expected if triptans in the U.S. were free,” Dr. Rapoport said.
Dr. Rapoport is the editor in chief of Neurology Reviews and serves as a consultant to several pharmaceutical companies.
FROM CEPHALALGIA
Core feature of frontotemporal dementia may aid diagnosis
White matter hyperintensities (WMH) appear to be a core feature of frontotemporal dementia (FTD), according to findings that may help physicians make this difficult diagnosis, which affects adults in their prime.
“The assessment of WMH can aid differential diagnosis of bvFTD [behavioral-variant FTD] against other neurodegenerative conditions in the absence of vascular risk factors, especially when considering their spatial distribution,” said senior author Ramón Landin-Romero, PhD, Appenzeller Neuroscience Fellow, Frontotemporal Dementia Research Group, University of Sydney.
“Clinicians can ask for specific sequences in routine MRI scans to visually detect WMH,” said Dr. Landin-Romero, who is also a senior lecturer in the School of Psychology and Brain and Mind Center.
The study was published online Feb. 17 in Neurology.
Difficult diagnosis
“FTD is a collection of unrecognized young-onset (before age 65) dementia syndromes that affect people in their prime,” said Dr. Landin-Romero. He added that heterogeneity in progression trajectories and symptoms – which can include changes in behavior and personality, language impairments, and psychosis – makes it a difficult disease to diagnose.
“As such, our research was motivated by the need of sensitive and specific biomarkers of FTD, which are urgently needed to aid diagnosis, prognosis, and treatment development,” he said.
Previous research has been limited; there have only been a “handful” of cohort and case studies and studies involving individuals with mutations in one FTD-causative gene.
FTD is genetically and pathologically complex, and there has been no clear correlation between genetic mutations/underlying pathology and clinical presentation, Dr. Landin-Romero said.
WMH are common in older individuals and are linked to increased risk for cognitive impairment and dementia. Traditionally, they have been associated with vascular risk factors, such as smoking and diabetes. “But the presentation of WMH in FTD and its associations with the severity of symptoms and brain atrophy across FTD symptoms remains to be established,” said Dr. Landin-Romero.
Higher disease severity
To explore the possible association, the researchers studied 129 patients with either bvFTD (n = 64; mean age, 64 years) or Alzheimer’s disease (n = 65; mean age, 64.66 years).
Neuropsychological assessments, medical and neurologic examinations, clinical interview, and structural brain MRI were conducted for all patients, who were compared with 66 age-, sex-, and education-matched healthy control persons (mean age, 64.69 years).
Some participants in the FTD, Alzheimer’s disease, and healthy control groups (n = 54, 44, and 26, respectively) also underwent genetic screening. Postmortem pathology findings were available for a small number of FTD and Alzheimer’s disease participants (n = 13 and 5, respectively).
The medical history included lifestyle and cardiovascular risk factors, as well as other health and neurologic conditions and medication history. Hypertension, hypercholesterolemia, diabetes, and smoking were used to assess vascular risk.
The FTD and Alzheimer’s disease groups did not differ with regard to disease duration (3.55 years; standard deviation, 1.75, and 3.24 years; SD, 1.59, respectively). However, disease severity was significantly higher among those with FTD than among those with Alzheimer’s disease, as measured by the FTD Rating Scale Rasch score (–0.52; SD, 1.28, vs. 0.78; SD, 1.55; P < .001).
Compared with healthy controls, patients in the FTD and Alzheimer’s disease groups scored significantly lower on the Addenbrooke’s Cognitive Examination–Revised (ACE-R) or ACE-III scale. Patients with Alzheimer’s disease showed “disproportionately larger deficits” in memory and visuospatial processing, compared with those with FTD, whereas those with FTD performed significantly worse than those with Alzheimer’s disease in the fluency subdomain.
A larger number of patients in the FTD group screened positive for genetic abnormalities than in the Alzheimer’s disease group; no participants in the healthy control group had genetic mutations.
Unexpected findings
Mean WMH volume was significantly higher in participants with FTD than in participants with Alzheimer’s disease and in healthy controls (mean, 0.76 mL, 0.40 mL, and 0.12 mL, respectively). These larger volumes contributed to greater disease severity and cortical atrophy. Moreover, disease severity was “found to be a strong predictor of WMH volume in FTD,” the authors stated. Among patients with FTD, WMH volumes did not differ significantly with regard to genetic mutation status or presence of strong family history.
After controlling for age, vascular risk did not significantly predict WMH volume in the FTD group (P = .16); however, that did not hold true in the Alzheimer’s disease group.
Increased WMH were associated with anterior brain regions in FTD and with posterior brain regions in Alzheimer’s disease. In both disorders, higher WMH volume in the corpus callosum was associated with poorer cognitive performance in the domain of attention.
“The spatial distribution of WMH mirrored patterns of brain atrophy in FTD and Alzheimer’s disease, was partially independent of cortical degeneration, and was correlated with cognitive deficits,” said Dr. Landin-Romero.
The findings were not what he and his research colleagues expected. “We were expecting that the amounts of WMH would be similar in FTD and Alzheimer’s disease, but we actually found higher levels in participants with FTD,” he said. Additionally, he anticipated that patients with either FTD or Alzheimer’s disease who had more severe disease would have more WMH, but that finding only held true for people with FTD.
“In sum, our findings show that WMH are a core feature of FTD and Alzheimer’s disease that can contribute to cognitive problems, and not simply as a marker of vascular disease,” said Dr. Landin-Romero.
Major research contribution
Commenting on the study, Jordi Matias-Guiu, PhD, MD, of the department of neurology, Hospital Clinico, San Carlos, Spain, considers the study to be a “great contribution to the field.” Dr. Matias-Guiu, who was not involved with the study, said that WMH “do not necessarily mean vascular pathology, and atrophy may partially explain these abnormalities and should be taken into account in the interpretation of brain MRI.
“WMH are present in both Alzheimer’s disease and FTD and are relevant to cognitive deficits found in these disorders,” he added.
The study was funded by grants from the National Health and Medical Research Council of Australia, the Dementia Research Team, and the ARC Center of Excellence in Cognition and Its Disorders. Dr. Landin-Romero is supported by the Appenzeller Neuroscience Fellowship in Alzheimer’s Disease and the ARC Center of Excellence in Cognition and Its Disorders Memory Program. The other authors’ disclosures are listed on the original article. Dr. Matias-Guiu reports no relevant financial relationships.
A version of this article first appeared on Medscape.com.
(FTD) in findings that may help physicians make this difficult diagnosis that affects adults in their prime.
“The assessment of WMH can aid differential diagnosis of bvFTD [behavioral-variant FTD] against other neurodegenerative conditions in the absence of vascular risk factors, especially when considering their spatial distribution,” said senior author Ramón Landin-Romero, PhD, Appenzeller Neuroscience Fellow, Frontotemporal Dementia Research Group, University of Sydney.
“Clinicians can ask for specific sequences in routine MRI scans to visually detect WMH,” said Dr. Landin-Romero, who is also a senior lecturer in the School of Psychology and Brain and Mind Center.
The study was published online Feb. 17 in Neurology.
Difficult diagnosis
“FTD is a collection of unrecognized young-onset (before age 65) dementia syndromes that affect people in their prime,” said Dr. Landin-Romero. He added that heterogeneity in progression trajectories and symptoms, which can include changes in behavior and personality, language impairments, and psychosis, make it a difficult disease to diagnose.
“As such, our research was motivated by the need of sensitive and specific biomarkers of FTD, which are urgently needed to aid diagnosis, prognosis, and treatment development,” he said.
Previous research has been limited; there have only been a “handful” of cohort and case studies and studies involving individuals with mutations in one FTD-causative gene.
FTD is genetically and pathologically complex, and there has been no clear correlation between genetic mutations/underlying pathology and clinical presentation, Dr. Landin-Romero said.
WMH are common in older individuals and are linked to increased risk for cognitive impairment and dementia. Traditionally, they have been associated with vascular risk factors, such as smoking and diabetes. “But the presentation of WMH in FTD and its associations with the severity of symptoms and brain atrophy across FTD symptoms remains to be established,” said Dr. Landin-Romero.
Higher disease severity
To explore the possible association, the researchers studied 129 patients with either bvFTD (n = 64; mean age, 64 years) or Alzheimer’s disease (n = 65; mean age, 64.66 years).
Neuropsychological assessments, medical and neurologic examinations, clinical interview, and structural brain MRI were conducted for all patients, who were compared with 66 age-, sex-, and education-matched healthy control persons (mean age, 64.69 years).
Some participants in the FTD, Alzheimer’s disease, and healthy control groups (n = 54, 44, and 26, respectively) also underwent genetic screening. Postmortem pathology findings were available for a small number of FTD and Alzheimer’s disease participants (n = 13 and 5, respectively).
The medical history included lifestyle and cardiovascular risk factors, as well as other health and neurologic conditions and medication history. Hypertension, hypercholesterolemia, diabetes, and smoking were used to assess vascular risk.
The FTD and Alzheimer’s disease groups did not differ with regard to disease duration (3.55 years; standard deviation, 1.75, and 3.24 years; SD, 1.59, respectively). However, disease severity was significantly higher among those with FTD than among those with Alzheimer’s disease, as measured by the FTD Rating Scale Rasch score (–0.52; SD, 1.28, vs. 0.78; SD, 1.55; P < .001).
Compared with healthy controls, patients in the FTD and Alzheimer’s disease groups scored significantly lower on the Addenbrooke’s Cognitive Examination–Revised (ACE-R) or ACE-III scale. Patients with Alzheimer’s disease showed “disproportionately larger deficits” in memory and visuospatial processing, compared with those with FTD, whereas those with FTD performed significantly worse than those with Alzheimer’s disease in the fluency subdomain.
A larger number of patients in the FTD group screened positive for genetic abnormalities than in the Alzheimer’s disease group; no participants in the healthy control group had genetic mutations.
Unexpected findings
Mean WMH volume was significantly higher in participants with FTD than in participants with Alzheimer’s disease and in healthy controls (mean, 0.76 mL, 0.40 mL, and 0.12 mL respectively). These larger volumes contributed to greater disease severity and cortical atrophy. Moreover, disease severity was “found to be a strong predictor of WMH volume in FTD,” the authors stated. Among patients with FTD, WMH volumes did not differ significantly with regard to genetic mutation status or presence of strong family history.
After controlling for age, vascular risk did not significantly predict WMH volume in the FTD group (P = .16); however, that did not hold true in the Alzheimer’s disease group.
Increased WMH were associated with anterior brain regions in FTD and with posterior brain regions in Alzheimer’s disease. In both disorders, higher WMH volume in the corpus callosum was associated with poorer cognitive performance in the domain of attention.
“The spatial distribution of WMH mirrored patterns of brain atrophy in FTD and Alzheimer’s disease, was partially independent of cortical degeneration, and was correlated with cognitive deficits,” said Dr. Landin-Romero.
The findings were not what he and his research colleagues expected. “We were expecting that the amounts of WMH would be similar in FTD and Alzheimer’s disease, but we actually found higher levels in participants with FTD,” he said. Additionally, he anticipated that patients with either FTD or Alzheimer’s disease who had more severe disease would have more WMH, but that finding only held true for people with FTD.
“In sum, our findings show that WMH are a core feature of FTD and Alzheimer’s disease that can contribute to cognitive problems, and not simply as a marker of vascular disease,” said Dr. Landin-Romero.
Major research contribution
Commenting on the study, Jordi Matias-Guiu, PhD, MD, of the department of neurology, Hospital Clinico, San Carlos, Spain, considers the study to be a “great contribution to the field.” Dr. Matias-Guiu, who was not involved with the study, said that WMH “do not necessarily mean vascular pathology, and atrophy may partially explain these abnormalities and should be taken into account in the interpretation of brain MRI.
“WMH are present in both Alzheimer’s disease and FTD and are relevant to cognitive deficits found in these disorders,” he added.
The study was funded by grants from the National Health and Medical Research Council of Australia, the Dementia Research Team, and the ARC Center of Excellence in Cognition and Its Disorders. Dr. Landin-Romero is supported by the Appenzeller Neuroscience Fellowship in Alzheimer’s Disease and the ARC Center of Excellence in Cognition and Its Disorders Memory Program. The other authors’ disclosures are listed on the original article. Dr. Matias-Guiu reports no relevant financial relationships.
A version of this article first appeared on Medscape.com.
(FTD) in findings that may help physicians make this difficult diagnosis that affects adults in their prime.
“The assessment of WMH can aid differential diagnosis of bvFTD [behavioral-variant FTD] against other neurodegenerative conditions in the absence of vascular risk factors, especially when considering their spatial distribution,” said senior author Ramón Landin-Romero, PhD, Appenzeller Neuroscience Fellow, Frontotemporal Dementia Research Group, University of Sydney.
“Clinicians can ask for specific sequences in routine MRI scans to visually detect WMH,” said Dr. Landin-Romero, who is also a senior lecturer in the School of Psychology and Brain and Mind Center.
The study was published online Feb. 17 in Neurology.
Difficult diagnosis
“FTD is a collection of unrecognized young-onset (before age 65) dementia syndromes that affect people in their prime,” said Dr. Landin-Romero. He added that heterogeneity in progression trajectories and symptoms, which can include changes in behavior and personality, language impairments, and psychosis, make it a difficult disease to diagnose.
“As such, our research was motivated by the need of sensitive and specific biomarkers of FTD, which are urgently needed to aid diagnosis, prognosis, and treatment development,” he said.
Previous research has been limited; there have been only a “handful” of cohort and case studies, along with studies of individuals carrying mutations in a single FTD-causative gene.
FTD is genetically and pathologically complex, and there has been no clear correlation between genetic mutations/underlying pathology and clinical presentation, Dr. Landin-Romero said.
WMH are common in older individuals and are linked to increased risk for cognitive impairment and dementia. Traditionally, they have been associated with vascular risk factors, such as smoking and diabetes. “But the presentation of WMH in FTD and its associations with the severity of symptoms and brain atrophy across FTD symptoms remains to be established,” said Dr. Landin-Romero.
Higher disease severity
To explore the possible association, the researchers studied 129 patients with either bvFTD (n = 64; mean age, 64 years) or Alzheimer’s disease (n = 65; mean age, 64.66 years).
Neuropsychological assessments, medical and neurologic examinations, clinical interview, and structural brain MRI were conducted for all patients, who were compared with 66 age-, sex-, and education-matched healthy control persons (mean age, 64.69 years).
Some participants in the FTD, Alzheimer’s disease, and healthy control groups (n = 54, 44, and 26, respectively) also underwent genetic screening. Postmortem pathology findings were available for a small number of FTD and Alzheimer’s disease participants (n = 13 and 5, respectively).
The medical history included lifestyle and cardiovascular risk factors, as well as other health and neurologic conditions and medication history. Hypertension, hypercholesterolemia, diabetes, and smoking were used to assess vascular risk.
The FTD and Alzheimer’s disease groups did not differ with regard to disease duration (3.55 years; standard deviation, 1.75, and 3.24 years; SD, 1.59, respectively). However, disease severity was significantly higher among those with FTD than among those with Alzheimer’s disease, as measured by the FTD Rating Scale Rasch score (–0.52; SD, 1.28, vs. 0.78; SD, 1.55; P < .001).
Compared with healthy controls, patients in the FTD and Alzheimer’s disease groups scored significantly lower on the Addenbrooke’s Cognitive Examination–Revised (ACE-R) or ACE-III scale. Patients with Alzheimer’s disease showed “disproportionately larger deficits” in memory and visuospatial processing, compared with those with FTD, whereas those with FTD performed significantly worse than those with Alzheimer’s disease in the fluency subdomain.
A larger number of patients in the FTD group screened positive for genetic abnormalities than in the Alzheimer’s disease group; no participants in the healthy control group had genetic mutations.
Unexpected findings
Mean WMH volume was significantly higher in participants with FTD than in participants with Alzheimer’s disease and in healthy controls (mean, 0.76 mL, 0.40 mL, and 0.12 mL, respectively). These larger volumes contributed to greater disease severity and cortical atrophy. Moreover, disease severity was “found to be a strong predictor of WMH volume in FTD,” the authors stated. Among patients with FTD, WMH volumes did not differ significantly with regard to genetic mutation status or presence of a strong family history.
After controlling for age, vascular risk did not significantly predict WMH volume in the FTD group (P = .16); however, that did not hold true in the Alzheimer’s disease group.
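“Controlling for age” here refers to a standard multiple-regression adjustment: fit WMH volume on age and vascular risk jointly, then inspect the vascular-risk coefficient. A minimal sketch with simulated data (illustrative only, not the study’s data or code), in which WMH volume is driven by age alone:

```python
import numpy as np

# Simulated illustration: does vascular risk predict WMH volume
# once age is already in the model?
rng = np.random.default_rng(0)
n = 200
age = rng.normal(64, 5, n)                           # years
vascular_risk = rng.integers(0, 4, n).astype(float)  # count of risk factors, 0-3
wmh_ml = 0.02 * age + rng.normal(0, 0.1, n)          # WMH driven by age only here

# Design matrix: intercept, age, vascular risk
X = np.column_stack([np.ones(n), age, vascular_risk])
coef, *_ = np.linalg.lstsq(X, wmh_ml, rcond=None)

# coef[1] recovers the age effect (~0.02); coef[2], the age-adjusted
# vascular-risk effect, stays near zero.
print(coef[1], coef[2])
```

In this toy setup the adjusted vascular-risk coefficient is indistinguishable from zero, mirroring the null result reported for the FTD group.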
Increased WMH were associated with anterior brain regions in FTD and with posterior brain regions in Alzheimer’s disease. In both disorders, higher WMH volume in the corpus callosum was associated with poorer cognitive performance in the domain of attention.
“The spatial distribution of WMH mirrored patterns of brain atrophy in FTD and Alzheimer’s disease, was partially independent of cortical degeneration, and was correlated with cognitive deficits,” said Dr. Landin-Romero.
The findings were not what he and his research colleagues expected. “We were expecting that the amounts of WMH would be similar in FTD and Alzheimer’s disease, but we actually found higher levels in participants with FTD,” he said. Additionally, he anticipated that patients with either FTD or Alzheimer’s disease who had more severe disease would have more WMH, but that finding only held true for people with FTD.
“In sum, our findings show that WMH are a core feature of FTD and Alzheimer’s disease that can contribute to cognitive problems, and not simply as a marker of vascular disease,” said Dr. Landin-Romero.
Major research contribution
Commenting on the study, Jordi Matias-Guiu, PhD, MD, of the department of neurology, Hospital Clinico, San Carlos, Spain, considers the study to be a “great contribution to the field.” Dr. Matias-Guiu, who was not involved with the study, said that WMH “do not necessarily mean vascular pathology, and atrophy may partially explain these abnormalities and should be taken into account in the interpretation of brain MRI.
“WMH are present in both Alzheimer’s disease and FTD and are relevant to cognitive deficits found in these disorders,” he added.
The study was funded by grants from the National Health and Medical Research Council of Australia, the Dementia Research Team, and the ARC Center of Excellence in Cognition and Its Disorders. Dr. Landin-Romero is supported by the Appenzeller Neuroscience Fellowship in Alzheimer’s Disease and the ARC Center of Excellence in Cognition and Its Disorders Memory Program. The other authors’ disclosures are listed on the original article. Dr. Matias-Guiu reports no relevant financial relationships.
A version of this article first appeared on Medscape.com.
EEG data may aid diagnosis, treatment of focal epilepsy
Seizure timing in epilepsy is far from random, new research suggests. Findings from a large longitudinal study show that seizure onset in patients with focal epilepsy follows circadian, multiday, and annual cycles.
“Although daily and multiday rhythms have previously been identified, the extent to which these nonrandom rhythms exist in a larger cohort has been unclear,” said study investigator Joline Marie Fan, MD, a clinical fellow at the University of California, San Francisco. “This means that a patient with epilepsy may have a unique combination of seizure rhythms that can inform the days and timing of his or her highest seizure risk,” she added.
The study was published online Feb. 8 in JAMA Neurology.
Distinct chronotypes
Clinicians and patients alike have long observed cyclical patterns in the onset of epileptic seizures. However, such patterns have rarely been measured in a quantitative way.
Previous studies have examined seizure cycles using inpatient seizure monitoring and patients’ seizure diaries, but the duration of these recordings and their accuracy have been limited. Within the past decade, the advent of chronic EEG (cEEG) has allowed researchers to observe the cyclical pattern of interictal epileptiform activity, but the numbers of patients involved in such studies have been limited.
To investigate seizure chronotypes in greater detail, the researchers examined retrospective data for 222 adults with medically refractory focal epilepsy who took part in clinical trials of the NeuroPace responsive neurostimulation (RNS) system.
After implantation in the brain, this system monitors the seizure focus or foci continuously and delivers stimulation to stop seizures. Participants also kept seizure diaries and classified their seizures as simple motor, simple other, complex partial, and generalized tonic-clonic.
Dr. Fan’s group examined three subpopulations of patients to investigate three durations of seizure cycles. They examined self-reported disabling seizures, electrographic seizures, and interictal epileptiform activity. Because patients did not record the time of their disabling seizures, the investigators examined them only in multidien and circannual cycles.
To examine circannual seizure cycles, the investigators included 194 patients who kept continuous seizure diaries for 2 or more years and who reported 24 or more days in which disabling seizures occurred.
To examine multidien seizure cycles, they included 186 participants who reported 24 or more days with disabling seizures over a period of 6 or more months during which the RNS system collected cEEG data. They included 85 patients who had 48 hours or more in which electrographic seizure counts were above zero during 6 or more months of cEEG data collection to examine circadian seizure cycles.
Phase-locking value (PLV) was used to determine the strength of a cycle (i.e., the degree of consistency with which seizures occur during certain phases of a cycle). A PLV of 0 represents a uniform distribution of events during various phases of a cycle; a PLV of 1 indicates that all events occur exactly at the same phase of a cycle.
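The PLV described above has a standard closed form: each event’s phase in the cycle is mapped to a unit vector on the circle, and the PLV is the magnitude of the mean vector. A minimal sketch in Python (illustrative only, not the study’s code):

```python
import numpy as np

def phase_locking_value(phases_rad):
    """PLV of event phases (radians): |mean of the unit vectors e^{i*phase}|."""
    return float(np.abs(np.mean(np.exp(1j * np.asarray(phases_rad)))))

# All events at the same phase of the cycle -> PLV = 1
print(round(phase_locking_value([0.5, 0.5, 0.5]), 6))  # 1.0

# Events spread uniformly around the cycle -> PLV = 0
print(round(phase_locking_value([0.0, np.pi / 2, np.pi, 3 * np.pi / 2]), 6))  # 0.0
```

Intermediate values fall between these extremes; the mean PLVs reported below (0.17 to 0.34) indicate partial but real clustering of seizures at particular cycle phases.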
The population’s median age was 35 years, and the sample included approximately equal numbers of men and women. Patients’ focal epilepsies included mesiotemporal (57.2%), frontal (14.0%), neocortical-temporal (9.9%), parietal (4.1%), occipital (1.4%), and multifocal (13.5%). The data included 1,118 patient-years of cEEG, 754,108 electrographic seizures, and 313,995 self-reported seizures.
The prevalence of statistically significant circannual seizure cycles in this population was 12%. The prevalence of multidien seizure cycles was 60%, and the prevalence of circadian seizure cycles was 89%. Multidien cycles (mean PLV, 0.34) and circadian cycles (mean PLV, 0.34) were stronger than were circannual cycles (mean PLV, 0.17).
Among patients with circannual seizure cycles, there was a weak to moderate tendency for seizures to occur during one of the four seasons. There was no overall trend toward seizure onset in one season among this group.
Among patients with multidien seizure cycles, investigators identified five patterns of interictal epileptiform activity fluctuations. One pattern had irregular periodicity, and the others reached peak periodicity at 7, 15, 20, and 30 days. For some patients, one or more periodicities occurred. For most patients, electrographic or self-reported seizures tended to occur on the rising phase of the interictal epileptiform activity cycle. Interictal epileptiform activity increased on days around seizures.
Results showed there were five main seizure peak times among patients with circadian seizure cycles: midnight, 3:00 a.m., 9:00 a.m., 2:00 p.m., and 6:00 p.m. These findings corroborate the observations of previous investigations, the researchers noted. Hourly interictal epileptiform activity peaked during the night, regardless of peak seizure time.
“Although the neurostimulation device offers us a unique opportunity to investigate electrographic seizure activity quantitatively, the generalizability of our study is limited to the patient cohort that we studied,” said Dr. Fan. “The study findings are limited to patients with neurostimulation devices used for intractable focal epilepsies.”
The results support patients’ impressions that their seizures occur in a cyclical pattern.
“Ultimately, these findings will be helpful for developing models to aid with seizure forecasting and prediction in order to help reduce the uncertainty of seizure timing for patients with epilepsy,” said Dr. Fan.
“Other implications include optimizing the timing for patients to be admitted into the hospital for seizure characterization based on their seizure chronotype, or possibly tailoring a medication regimen in accordance with a patient’s seizure cycles,” she added.
Need for more research
Commenting on the findings, Tobias Loddenkemper, MD, professor of neurology at Harvard Medical School, Boston, noted that the study is “one of the largest longitudinal seizure pattern analyses, based on the gold standard of intracranially recorded epileptic seizures.”
The research, he added, extends neurologists’ understanding of seizure patterns over time, expands knowledge about seizure chronotypes, and emphasizes a relationship between interictal epileptiform activity and seizures.
The strengths of the study include the recording of seizures with intracranial EEG, its large number of participants, and the long duration of recordings, Dr. Loddenkemper said.
However, he said, it is important to note that self-reports are not always reliable. The results may also reflect the influence of potential confounders of seizure patterns, such as seizure triggers, treatment, stimulation, or sleep-wake, circadian, or hormonal cycles, he added.
“In the short term, validation studies, as well as confirmatory studies with less invasive sensors, may be needed,” said Dr. Loddenkemper.
“This could potentially include a trial that confirms findings prospectively, utilizing results from video EEG monitoring admissions. In the long term, seizure detection and prediction, as well as interventional chronotherapeutic trials, may be enabled, predicting seizures in individual patients and treating at times of greatest seizure susceptibility.”
The study was supported by grants to some of the authors from the Wyss Center for Bio and Neuroengineering, the Ernest Gallo Foundation, the Swiss National Science Foundation, and the Velux Stiftung. Dr. Fan has disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM JAMA NEUROLOGY
New data may help intercept head injuries in college football
Novel research from the Concussion Assessment, Research and Education (CARE) Consortium sheds new light on how to effectively reduce the incidence of concussion and head injury exposure in college football.
The study, led by neurotrauma experts Michael McCrea, PhD, and Brian Stemper, PhD, professors of neurosurgery at the Medical College of Wisconsin in Milwaukee, reports data from hundreds of college football players across five seasons and shows that concussion and head impact exposure are disproportionately concentrated in the preseason.
The research also reveals that such injuries occur more often during practices than games.
“We think that with the findings from this paper, there’s a role for everybody to play in reducing injury,” Dr. McCrea said. “We hope these data help inform broad-based policy about practice and preseason training policies in collegiate football. We also think there’s a role for athletic administrators, coaches, and even athletes themselves.”
The study was published online Feb. 1 in JAMA Neurology.
More injuries in preseason
Concussion is one of the most common injuries in football. Beyond the immediate harms of concussion, there are growing concerns that repetitive head injury exposure (HIE) may increase the risk of long-term neurologic health problems, including chronic traumatic encephalopathy (CTE).
The CARE Consortium, which has been conducting research with college athletes across 26 sports and military cadets since 2014, has been interested in multiple facets of concussion and brain trauma.
“We’ve enrolled more than 50,000 athletes and service academy cadets into the consortium over the last 6 years to research all involved aspects, including the clinical core, the imaging core, the blood biomarker core, and the genetic core, and we have a head impact measurement core,” Dr. McCrea said.
To investigate the pattern of concussion incidence across the football season in college players, the investigators used impact measurement technology across six Division I NCAA football programs participating in the CARE Consortium from 2015 to 2019.
A total of 658 players – all male, mean age 19 years – were fitted with the Head Impact Telemetry System (HITS) sensor arrays in their helmets to measure head impact frequency, location, and magnitude during play.
“This particular study had built-in algorithms that weeded out impacts that were below 10G of linear magnitude, because those have been determined not likely to be real impacts,” Dr. McCrea said.
Across the five seasons studied, 528,684 head impacts recorded met the quality standards for analysis. Players sustained a median of 415 (interquartile range [IQR], 190-727) impacts per season.
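As an illustrative sketch with hypothetical numbers (not the study’s data), the 10G quality filter and the median/IQR season summary can be reproduced in a few lines:

```python
import numpy as np

# Hypothetical per-impact linear accelerations (in g) for one player;
# impacts under 10 g are discarded as likely spurious, mirroring the
# study's quality threshold.
impacts_g = np.array([8.2, 15.4, 9.9, 32.1, 11.7, 55.0, 10.3])
valid = impacts_g[impacts_g >= 10.0]
print(valid.size)  # 5 impacts survive the filter

# Season summary in the style reported above: median and IQR of
# per-player impact counts (hypothetical counts).
counts = np.array([190, 300, 415, 520, 727])
q1, med, q3 = np.percentile(counts, [25, 50, 75])
print(med, q1, q3)  # 415.0 300.0 520.0
```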
Of those, 68 players sustained a diagnosed concussion. In total, 48.5% of concussions occurred during preseason training, despite preseason representing only 20.8% of the football season. Total head injury exposure in the preseason occurred at twice the proportion of the regular season (324.9 vs. 162.4 impacts per team per day; mean difference, 162.6 impacts; 95% confidence interval, 110.9-214.3; P < .001).
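A quick back-of-the-envelope check of the figures above shows how concentrated the preseason burden is:

```python
# Figures reported in the CARE Consortium analysis above.
preseason_share_of_season = 0.208       # preseason = 20.8% of the season
preseason_share_of_concussions = 0.485  # 48.5% of diagnosed concussions

# If concussions were spread evenly across the season, the preseason
# would account for 20.8% of them; instead its rate is ~2.3x that baseline.
relative_rate = preseason_share_of_concussions / preseason_share_of_season
print(round(relative_rate, 2))  # 2.33

# Daily team head injury exposure, preseason vs. regular season:
print(round(324.9 / 162.4, 2))  # 2.0
```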
“Preseason training often has a much higher intensity to it, in terms of the total hours, the actual training, and the heavy emphasis on full-contact drills like tackling and blocking,” said Dr. McCrea. “Even the volume of players that are participating is greater.”
Results also showed that in each of the five seasons, head injury exposure per athlete was highest in August (preseason) (median, 146.0 impacts; IQR, 63.0-247.8) and lowest in November (median, 80.0 impacts; IQR, 35.0-148.0). In the studied period, 72% of concussions and 66.9% of head injury exposure occurred in practice. Even within the regular season, total head injury exposure in practices was 84.2% higher than in games.
“This incredible dataset we have on head impact measurement also gives us the opportunity to compare it with our other research looking at the correlation between a single head impact and changes in brain structure and function on MRI, on blood biomarkers, giving us the ability to look at the connection between mechanism of effect of injury and recovery from injury,” said Dr. McCrea.
These findings also provide an opportunity to modify approaches to preseason training and football practices to keep players safer, said Dr. McCrea, noting that about half of the variance in head injury exposure is at the level of the individual athlete.
“With this large body of athletes we’ve instrumented, we can look at, for instance, all of the running backs and understand the athlete and what his head injury exposure looks like compared to all other running backs. If we find out that an athlete has a rate of head injury exposure that’s 300% higher than most other players that play the same position, we can take that data directly to the athlete to work on their technique and approach to the game.
“Every researcher wishes that their basic science or their clinical research findings will have some impact on the health and well-being of the population they’re studying. By modifying practices and preseason training, football teams could greatly reduce the risk of injury and exposure for their players, while still maintaining the competitive nature of game play,” he added.
Through a combination of policy and education, similar strategies could be implemented to help prevent concussion and HIE in high school and youth football too, said Dr. McCrea.
‘Shocking’ findings
In an accompanying editorial, Christopher J. Nowinski, PhD, of the Concussion Legacy Foundation, Boston, and Robert C. Cantu, MD, department of neurosurgery, Emerson Hospital, Concord, Massachusetts, said the findings could have significant policy implications and offer a valuable expansion of prior research.
“From 2005 to 2010, studies on college football revealed that about two-thirds of head impacts occurred in practice,” they noted. “We cited this data in 2010 when we proposed to the NFL Players Association that the most effective way to reduce the risks of negative neurological outcomes was to reduce hitting in practice. They agreed, and in 2011 collectively bargained for severe contact limits in practice, with 14 full-contact practices allowed during the 17-week season. Since that rule was implemented, only 18% of NFL concussions have occurred in practice.”
“Against this backdrop, the results of the study by McCrea et al. are shocking,” they added. “It reveals that college football players still experience 72% of their concussions and 67% of their total head injury exposure in practice.”
Even more shocking, noted Dr. Nowinski and Dr. Cantu, is that these numbers are almost certainly an underestimate of the dangers of practice.
“As a former college football player and a former team physician, respectively, we find this situation inexcusable. Concussions in games are inevitable, but concussions in practice are preventable,” they wrote.
“Laudably,” they added “the investigators call on the NCAA and football conferences to explore policy and rule changes to reduce concussion incidence and HIE and to create robust educational offerings to encourage change from coaches and college administrators.”
A version of this article first appeared on Medscape.com.
Novel research from the Concussion Assessment, Research and Education (CARE) Consortium sheds new light on how to effectively reduce the incidence of concussion and head injury exposure in college football.
The study, led by neurotrauma experts Michael McCrea, PhD, and Brian Stemper, PhD, professors of neurosurgery at the Medical College of Wisconsin in Milwaukee, reports data from hundreds of college football players across five seasons and shows that concussions and head impact exposure are disproportionately concentrated in preseason training.
The research also reveals that such injuries occur more often during practices than games.
“We think that with the findings from this paper, there’s a role for everybody to play in reducing injury,” Dr. McCrea said. “We hope these data help inform broad-based policy about practice and preseason training policies in collegiate football. We also think there’s a role for athletic administrators, coaches, and even athletes themselves.”
The study was published online Feb. 1 in JAMA Neurology.
More injuries in preseason
Concussion is one of the most common injuries in football. Beyond the immediate harms, there is growing concern that repetitive head injury exposure (HIE) may increase the risk of long-term neurologic health problems, including chronic traumatic encephalopathy (CTE).
The CARE Consortium, which has been conducting research with college athletes across 26 sports and military cadets since 2014, has been interested in multiple facets of concussion and brain trauma.
“We’ve enrolled more than 50,000 athletes and service academy cadets into the consortium over the last 6 years to research all involved aspects including the clinical core, the imaging core, the blood biomarker core, and the genetic core, and we have a head impact measurement core,” said Dr. McCrea.
To investigate the pattern of concussion incidence across the football season in college players, the investigators used impact measurement technology across six Division I NCAA football programs participating in the CARE Consortium from 2015 to 2019.
A total of 658 players – all male, mean age 19 years – were fitted with the Head Impact Telemetry System (HITS) sensor arrays in their helmets to measure head impact frequency, location, and magnitude during play.
“This particular study had built-in algorithms that weeded out impacts that were below 10G of linear magnitude, because those have been determined not likely to be real impacts,” Dr. McCrea said.
Across the five seasons studied, 528,684 head impacts recorded met the quality standards for analysis. Players sustained a median of 415 (interquartile range [IQR], 190-727) impacts per season.
Sixty-eight of the players sustained a diagnosed concussion. In total, 48.5% of concussions occurred during preseason training, despite the preseason representing only 20.8% of the football season. Total head injury exposure in the preseason occurred at twice the rate of the regular season (324.9 vs. 162.4 impacts per team per day; mean difference, 162.6 impacts; 95% confidence interval, 110.9-214.3; P < .001).
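The preseason figures above are internally consistent; a quick arithmetic check (values as reported in the article; the small gap between 162.5 here and the reported 162.6 mean difference reflects rounding of the underlying unrounded means in the paper):

```python
# Reported per-team daily head impact exposure.
preseason_rate = 324.9       # impacts per team per day, preseason
regular_season_rate = 162.4  # impacts per team per day, regular season

# The article describes preseason exposure as roughly twice the regular season.
ratio = preseason_rate / regular_season_rate
print(round(ratio, 2))  # ≈ 2.0

# Difference of the rounded rates; the paper's 162.6 comes from unrounded means.
print(round(preseason_rate - regular_season_rate, 1))  # 162.5

# Concussion burden: 48.5% of concussions occurred in 20.8% of the season,
# i.e., concussions were over-represented in preseason by roughly 2.3x.
concussion_share = 48.5
season_share = 20.8
print(round(concussion_share / season_share, 1))  # 2.3
```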
“Preseason training often has a much higher intensity to it, in terms of the total hours, the actual training, and the heavy emphasis on full-contact drills like tackling and blocking,” said Dr. McCrea. “Even the volume of players that are participating is greater.”
Results also showed that in each of the five seasons, head injury exposure per athlete was highest in August, during the preseason (median, 146.0 impacts; IQR, 63.0-247.8), and lowest in November (median, 80.0 impacts; IQR, 35.0-148.0). Across the study period, 72% of concussions and 66.9% of head injury exposure occurred in practice. Even within the regular season, total head injury exposure in practices was 84.2% higher than in games.
“This incredible dataset we have on head impact measurement also gives us the opportunity to compare it with our other research looking at the correlation between a single head impact and changes in brain structure and function on MRI, on blood biomarkers, giving us the ability to look at the connection between mechanism of effect of injury and recovery from injury,” said Dr. McCrea.
These findings also provide an opportunity to modify approaches to preseason training and football practices to keep players safer, said Dr. McCrea, noting that about half of the variance in head injury exposure is at the level of the individual athlete.
“With this large body of athletes we’ve instrumented, we can look at, for instance, all of the running backs and understand the athlete and what his head injury exposure looks like compared to all other running backs. If we find out that an athlete has a rate of head injury exposure that’s 300% higher than most other players that play the same position, we can take that data directly to the athlete to work on their technique and approach to the game.
“Every researcher wishes that their basic science or their clinical research findings will have some impact on the health and well-being of the population they’re studying. By modifying practices and preseason training, football teams could greatly reduce the risk of injury and exposure for their players, while still maintaining the competitive nature of game play,” he added.
Through a combination of policy and education, similar strategies could be implemented to help prevent concussion and HIE in high school and youth football too, said Dr. McCrea.
‘Shocking’ findings
In an accompanying editorial, Christopher J. Nowinski, PhD, of the Concussion Legacy Foundation, Boston, and Robert C. Cantu, MD, department of neurosurgery, Emerson Hospital, Concord, Massachusetts, said the findings could have significant policy implications and offer a valuable expansion of prior research.
“From 2005 to 2010, studies on college football revealed that about two-thirds of head impacts occurred in practice,” they noted. “We cited this data in 2010 when we proposed to the NFL Players Association that the most effective way to reduce the risks of negative neurological outcomes was to reduce hitting in practice. They agreed, and in 2011 collectively bargained for severe contact limits in practice, with 14 full-contact practices allowed during the 17-week season. Since that rule was implemented, only 18% of NFL concussions have occurred in practice.”
“Against this backdrop, the results of the study by McCrea et al. are shocking,” they added. “It reveals that college football players still experience 72% of their concussions and 67% of their total head injury exposure in practice.”
Even more shocking, noted Dr. Nowinski and Dr. Cantu, is that these numbers are almost certainly an underestimate of the dangers of practice.
“As a former college football player and a former team physician, respectively, we find this situation inexcusable. Concussions in games are inevitable, but concussions in practice are preventable,” they wrote.
“Laudably,” they added, “the investigators call on the NCAA and football conferences to explore policy and rule changes to reduce concussion incidence and HIE and to create robust educational offerings to encourage change from coaches and college administrators.”
A version of this article first appeared on Medscape.com.
FROM JAMA NEUROLOGY
New steroid dosing regimen for myasthenia gravis
The trial showed that the conventional slow-tapering regimen enabled discontinuation of prednisone earlier than previously reported, but the new rapid-tapering regimen enabled even faster discontinuation.
Although both regimens led to comparable myasthenia gravis status and prednisone dose at 15 months, the authors stated: “We think that the reduction of the cumulative dose over a year (equivalent to 5 mg/day) is a clinically relevant reduction, since the risk of complications is proportional to the daily or cumulative doses of prednisone.
“Our results warrant testing of a more rapid-tapering regimen in a future trial. In the meantime, our trial provides useful information on how prednisone tapering could be managed in patients with generalized myasthenia gravis treated with azathioprine,” they concluded.
The trial was published online Feb. 8 in JAMA Neurology.
Myasthenia gravis is a disorder of neuromuscular transmission, resulting from autoantibodies to components of the neuromuscular junction, most commonly the acetylcholine receptor. The incidence ranges from 0.3 to 2.8 per 100,000, and it is estimated to affect more than 700,000 people worldwide.
The authors of the current paper, led by Tarek Sharshar, MD, PhD, Groupe Hospitalier Universitaire, Paris, explained that many patients whose symptoms are not controlled by cholinesterase inhibitors are treated with corticosteroids and an immunosuppressant, usually azathioprine. No specific dosing protocol for prednisone has been validated, but it is commonly gradually increased to 0.75 mg/kg on alternate days and reduced progressively when minimal manifestation status (MMS; no symptoms or functional limitations) is reached.
They noted that this regimen leads to high and prolonged corticosteroid treatment – often for several years – with the mean daily prednisone dose exceeding 30 mg/day at 15 months and 20 mg/day at 36 months. As long-term use of corticosteroids is often associated with significant complications, reducing or even discontinuing prednisone treatment without destabilizing myasthenia gravis is therefore a therapeutic goal.
Evaluating dosage regimens
To investigate whether different dosage regimens could help wean patients with generalized myasthenia gravis from corticosteroid therapy without compromising efficacy, the researchers conducted this study in which the current recommended regimen was compared with an approach using higher initial corticosteroid doses followed by rapid tapering.
In the conventional slow-tapering group (control group), prednisone was given on alternate days, starting at a dose of 10 mg then increased by increments of 10 mg every 2 days up to 1.5 mg/kg on alternate days without exceeding 100 mg. This dose was maintained until MMS was reached and then reduced by 10 mg every 2 weeks until a dosage of 40 mg was reached, with subsequent slowing of the taper to 5 mg monthly. If MMS was not maintained, the alternate-day prednisone dose was increased by 10 mg every 2 weeks until MMS was restored, and the tapering resumed 4 weeks later.
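The escalation arithmetic of the control arm can be sketched as follows (a minimal illustration, not the trial protocol itself; the function name and the 70 kg example weight are hypothetical, while the 10 mg starting dose, 10 mg increments, 1.5 mg/kg alternate-day target, and 100 mg cap are taken from the description above):

```python
def slow_taper_escalation(weight_kg):
    """Sketch of the control arm's escalation phase: alternate-day prednisone
    starting at 10 mg, increased by 10 mg every 2 days, up to 1.5 mg/kg on
    alternate days without exceeding 100 mg. Returns the sequence of
    alternate-day doses (mg) until the target/cap is reached."""
    target = min(1.5 * weight_kg, 100.0)
    doses = []
    dose = 10.0
    while dose < target:
        doses.append(dose)
        dose += 10.0
    doses.append(min(dose, target))  # final step lands on the target or cap
    return doses

# Hypothetical 70 kg patient: 1.5 mg/kg = 105 mg, so the 100 mg cap applies.
print(slow_taper_escalation(70))
# [10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 70.0, 80.0, 90.0, 100.0]
```

With one increment every 2 days, such a patient would take roughly 18 days to reach the maintenance dose, which is then held until MMS.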
In the new rapid-tapering group, oral prednisone was immediately started at 0.75 mg/kg per day, and this was followed by an earlier and rapid decrease once improved myasthenia gravis status was attained. Three different tapering schedules were applied dependent on the improvement status of the patient.
First, if the patient reached MMS at 1 month, the dose of prednisone was reduced by 0.1 mg/kg every 10 days down to 0.45 mg/kg per day, then by 0.05 mg/kg every 10 days down to 0.25 mg/kg per day, and then in decrements of 1 mg, with the interval between decrements adjusted according to the participant’s weight, with the aim of achieving complete cessation of corticosteroid therapy within 18-20 weeks of this third stage of tapering.
Second, if the state of MMS was not reached at 1 month but the participant had improved, a slower tapering was conducted, with the dosage reduced in a similar way to the first instance but with each reduction introduced every 20 days. If the participant reached MMS during this tapering process, the tapering of prednisone was similar to the sequence described in the first group.
Third, if MMS was not reached and the participant had not improved, the initial dose was maintained for the first 3 months; beyond that time, a decrease in the prednisone dose was undertaken as in the second group to a minimum dose of 0.25 mg/kg per day, after which the prednisone dose was not reduced further. If the patient improved, the tapering of prednisone followed the sequence described in the second category.
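The first (fastest) of the three schedules can be sketched numerically (an illustration only: the function names are hypothetical, and because the trial's exact weight-based rule for spacing the final 1 mg decrements is not given above, the sketch simply spreads them evenly over 19 weeks, the midpoint of the stated 18-20 week window):

```python
def rapid_taper_mgkg_stages():
    """Per-kg dose milestones of the rapid-tapering arm's first schedule
    (MMS reached at 1 month): -0.1 mg/kg every 10 days down to 0.45 mg/kg/day,
    then -0.05 mg/kg every 10 days down to 0.25 mg/kg/day."""
    stages = []
    dose = 0.75  # starting dose, mg/kg per day
    while dose > 0.45:
        dose = round(dose - 0.1, 2)
        stages.append(dose)
    while dose > 0.25:
        dose = round(dose - 0.05, 2)
        stages.append(dose)
    return stages

def third_stage_interval_days(weight_kg, target_weeks=19):
    """Illustrative spacing of the final 1 mg decrements so the patient reaches
    0 mg within the stated 18-20 weeks (19 used as an assumed midpoint)."""
    start_mg = 0.25 * weight_kg      # daily dose entering the third stage
    n_steps = round(start_mg)        # number of 1 mg decrements to reach zero
    return round(target_weeks * 7 / n_steps, 1)

print(rapid_taper_mgkg_stages())      # [0.65, 0.55, 0.45, 0.4, 0.35, 0.3, 0.25]
print(third_stage_interval_days(70))  # ≈ 7.4 days between 1 mg reductions
```

For a hypothetical 70 kg patient, the first two stages take about 70 days (seven 10-day steps), after which the ~17.5 mg/day dose is walked down by 1 mg roughly every week.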
Reductions in prednisone dose could be accelerated in the case of severe prednisone adverse effects, according to the prescriber’s decision.
In the event of a myasthenia gravis exacerbation, the patient was hospitalized and the dose of prednisone was routinely doubled, or for a more moderate aggravation, the dose was increased to the previous dose recommended in the tapering regimen.
Azathioprine, up to a maximum dose of 3 mg/kg per day, was prescribed for all participants. In all, 117 patients were randomly assigned, and 113 completed the study.
The primary outcome was the proportion of participants having reached MMS without prednisone at 12 months and having not relapsed or taken prednisone between months 12 and 15. This was achieved by significantly more patients in the rapid-tapering group (39% vs. 9%; risk ratio, 3.61; P < .001).
Rapid tapering allowed sparing of a mean of 1,898 mg of prednisone over 1 year (5.3 mg/day) per patient.
The rate of myasthenia gravis exacerbation or worsening did not differ significantly between the two groups, nor did the use of plasmapheresis or IVIG or the doses of azathioprine.
The overall number of serious adverse events did not differ significantly between the two groups (slow tapering, 22% vs. rapid-tapering, 36%; P = .15).
The researchers said it is possible that prednisone tapering would differ with another immunosuppressive agent but as azathioprine is the first-line immunosuppressant usually recommended, these results are relevant for a large proportion of patients.
They said the better outcome of the intervention group could have been related to one or more of four differences in prednisone administration: An immediate high dose versus a slow increase of the prednisone dose; daily versus alternate-day dosing; earlier tapering initiation; and faster tapering. However, the structure of the study did not allow identification of which of these factors was responsible.
“Researching the best prednisone-tapering scheme is not only a major issue for patients with myasthenia gravis but also for other autoimmune or inflammatory diseases, because validated prednisone-tapering regimens are scarce,” the authors said.
The rapid tapering of prednisone therapy appears to be feasible, beneficial, and safe in patients with generalized myasthenia gravis and “warrants testing in other autoimmune diseases,” they added.
Particularly relevant to late-onset disease
Commenting on the study, Raffi Topakian, MD, Klinikum Wels-Grieskirchen, Wels, Austria, said the results showed that in patients with moderate to severe generalized myasthenia gravis requiring high-dose prednisone, azathioprine, a widely used immunosuppressant, may have a quicker steroid-sparing effect than previously thought, and that rapid steroid tapering can be achieved safely, resulting in a reduction of the cumulative steroid dose over a year despite higher initial doses.
Dr. Topakian, who was not involved with the research, pointed out that the median age was advanced (around 56 years), and the benefit of a regimen that leads to a reduction of the cumulative steroid dose over a year may be disproportionately larger for older, sicker patients with many comorbidities who are at considerably higher risk for a prednisone-induced increase in cardiovascular complications, osteoporotic fractures, and gastrointestinal bleeding.
“The study findings are particularly relevant for the management of late-onset myasthenia gravis (when first symptoms start after age 45-50 years), which is being encountered more frequently over the past years,” he said.
“But the holy grail of myasthenia gravis treatment has not been found yet,” Dr. Topakian noted. “Disappointingly, rapid tapering of steroids (compared to slow tapering) resulted in a reduction of the cumulative steroid dose only, but was not associated with better myasthenia gravis functional status or lower doses of steroids at 15 months. To my view, this finding points to the limited immunosuppressive efficacy of azathioprine.”
He added that the study findings should not be extrapolated to patients with mild presentations or to those with muscle-specific kinase myasthenia gravis.
Dr. Sharshar disclosed no relevant financial relationships. Disclosures for the study coauthors appear in the original article.
A version of this article first appeared on Medscape.com.
Particularly relevant to late-onset disease
Commenting on the study, Raffi Topakian, MD, Klinikum Wels-Grieskirchen, Wels, Austria, said the results showed that in patients with moderate to severe generalized myasthenia gravis requiring high-dose prednisone, azathioprine, a widely used immunosuppressant, may have a quicker steroid-sparing effect than previously thought, and that rapid steroid tapering can be achieved safely, resulting in a reduction of the cumulative steroid dose over a year despite higher initial doses.
Dr. Topakian, who was not involved with the research, pointed out that the median age was advanced (around 56 years), and the benefit of a regimen that leads to a reduction of the cumulative steroid dose over a year may be disproportionately larger for older, sicker patients with many comorbidities who are at considerably higher risk for a prednisone-induced increase in cardiovascular complications, osteoporotic fractures, and gastrointestinal bleeding.
“The study findings are particularly relevant for the management of late-onset myasthenia gravis (when first symptoms start after age 45-50 years), which is being encountered more frequently over the past years,” he said.
“But the holy grail of myasthenia gravis treatment has not been found yet,” Dr. Topakian noted. “Disappointingly, rapid tapering of steroids (compared to slow tapering) resulted in a reduction of the cumulative steroid dose only, but was not associated with better myasthenia gravis functional status or lower doses of steroids at 15 months. To my view, this finding points to the limited immunosuppressive efficacy of azathioprine.”
He added that the study findings should not be extrapolated to patients with mild presentations or to those with muscle-specific kinase myasthenia gravis.
Dr. Sharshar disclosed no relevant financial relationships. Disclosures for the study coauthors appear in the original article.
A version of this article first appeared on Medscape.com.
The trial showed that the conventional slow-tapering regimen enabled discontinuation of prednisone earlier than previously reported, but the new rapid-tapering regimen enabled an even faster discontinuation.
Noting that although both regimens led to a comparable myasthenia gravis status and prednisone dose at 15 months, the authors stated: “We think that the reduction of the cumulative dose over a year (equivalent to 5 mg/day) is a clinically relevant reduction, since the risk of complications is proportional to the daily or cumulative doses of prednisone.
“Our results warrant testing of a more rapid-tapering regimen in a future trial. In the meantime, our trial provides useful information on how prednisone tapering could be managed in patients with generalized myasthenia gravis treated with azathioprine,” they concluded.
The trial was published online Feb. 8 in JAMA Neurology.
Myasthenia gravis is a disorder of neuromuscular transmission, resulting from autoantibodies to components of the neuromuscular junction, most commonly the acetylcholine receptor. The incidence ranges from 0.3 to 2.8 per 100,000, and it is estimated to affect more than 700,000 people worldwide.
The authors of the current paper, led by Tarek Sharshar, MD, PhD, Groupe Hospitalier Universitaire, Paris, explained that many patients whose symptoms are not controlled by cholinesterase inhibitors are treated with corticosteroids and an immunosuppressant, usually azathioprine. No specific dosing protocol for prednisone has been validated, but it is commonly gradually increased to 0.75 mg/kg on alternate days and reduced progressively when minimal manifestation status (MMS; no symptoms or functional limitations) is reached.
They noted that this regimen leads to high and prolonged corticosteroid treatment – often for several years – with the mean daily prednisone dose exceeding 30 mg/day at 15 months and 20 mg/day at 36 months. As long-term use of corticosteroids is often associated with significant complications, reducing or even discontinuing prednisone treatment without destabilizing myasthenia gravis is therefore a therapeutic goal.
Evaluating dosage regimens
To investigate whether different dosage regimens could help wean patients with generalized myasthenia gravis from corticosteroid therapy without compromising efficacy, the researchers conducted this study in which the current recommended regimen was compared with an approach using higher initial corticosteroid doses followed by rapid tapering.
In the conventional slow-tapering group (control group), prednisone was given on alternate days, starting at a dose of 10 mg then increased by increments of 10 mg every 2 days up to 1.5 mg/kg on alternate days without exceeding 100 mg. This dose was maintained until MMS was reached and then reduced by 10 mg every 2 weeks until a dosage of 40 mg was reached, with subsequent slowing of the taper to 5 mg monthly. If MMS was not maintained, the alternate-day prednisone dose was increased by 10 mg every 2 weeks until MMS was restored, and the tapering resumed 4 weeks later.
In the new rapid-tapering group, oral prednisone was immediately started at 0.75 mg/kg per day, followed by an earlier and more rapid decrease once improved myasthenia gravis status was attained. Three different tapering schedules were applied, depending on the patient's improvement status.
First, if the patient reached MMS at 1 month, the dose of prednisone was reduced by 0.1 mg/kg every 10 days down to 0.45 mg/kg per day, then by 0.05 mg/kg every 10 days down to 0.25 mg/kg per day, and then in decrements of 1 mg, with the interval between decrements adjusted according to the participant’s weight, the aim being complete cessation of corticosteroid therapy within 18-20 weeks for this third stage of tapering.
Second, if the state of MMS was not reached at 1 month but the participant had improved, a slower tapering was conducted, with the dosage reduced in a similar way to the first instance but with each reduction introduced every 20 days. If the participant reached MMS during this tapering process, the tapering of prednisone was similar to the sequence described in the first group.
Third, if MMS was not reached and the participant had not improved, the initial dose was maintained for the first 3 months; beyond that time, a decrease in the prednisone dose was undertaken as in the second group to a minimum dose of 0.25 mg/kg per day, after which the prednisone dose was not reduced further. If the patient improved, the tapering of prednisone followed the sequence described in the second category.
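For illustration only, the first scenario above can be turned into a short schedule generator. This is a reader's sketch, not part of the trial protocol: the function name, the fixed 10-day step parameter, and the way the final 1-mg decrements are spread over roughly 19 weeks are all simplifying assumptions.

```python
def rapid_taper_schedule(weight_kg, step_days=10):
    """Sketch of the first rapid-taper scenario (MMS reached at 1 month):
    start at 0.75 mg/kg/day, drop 0.1 mg/kg every 10 days to 0.45 mg/kg/day,
    then 0.05 mg/kg every 10 days to 0.25 mg/kg/day, then 1-mg decrements
    spread so that this last stage lasts about 19 weeks (a simplification
    of the weight-adjusted intervals described in the protocol)."""
    schedule = []  # list of (day, daily prednisone dose in mg)
    day, per_kg = 0, 0.75
    while per_kg > 0.45:  # stage 1: -0.1 mg/kg at each step
        schedule.append((day, round(per_kg * weight_kg, 1)))
        per_kg, day = round(per_kg - 0.10, 2), day + step_days
    while per_kg > 0.25:  # stage 2: -0.05 mg/kg at each step
        schedule.append((day, round(per_kg * weight_kg, 1)))
        per_kg, day = round(per_kg - 0.05, 2), day + step_days
    mg = round(per_kg * weight_kg)  # stage 3: 1-mg decrements to zero
    interval = max(1, (19 * 7) // max(mg, 1))  # spread over ~19 weeks
    while mg > 0:
        schedule.append((day, float(mg)))
        mg, day = mg - 1, day + interval
    schedule.append((day, 0.0))
    return schedule
```

For an 80-kg patient, this sketch yields 60 mg/day on day 0, 36 mg/day on day 30, and cessation around day 190, consistent with the staged reductions described above.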
Reductions in prednisone dose could be accelerated in the case of severe prednisone adverse effects, according to the prescriber’s decision.
In the event of a myasthenia gravis exacerbation, the patient was hospitalized and the prednisone dose was routinely doubled; for a more moderate aggravation, the dose was increased to the previous dose recommended in the tapering regimen.
Azathioprine, up to a maximum dose of 3 mg/kg per day, was prescribed for all participants. In all, 117 patients were randomly assigned, and 113 completed the study.
The primary outcome was the proportion of participants who had reached MMS without prednisone at 12 months and had neither relapsed nor taken prednisone between months 12 and 15. This was achieved by significantly more patients in the rapid-tapering group (39% vs. 9%; risk ratio, 3.61; P < .001).
Rapid tapering allowed sparing of a mean of 1,898 mg of prednisone over 1 year (5.3 mg/day) per patient.
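The per-day figure is simple arithmetic on the cumulative saving; a quick check (the 5.3 mg/day quoted above presumably reflects rounding at an earlier step in the paper's calculation):

```python
# Mean cumulative prednisone spared per patient over 1 year
total_mg_spared = 1898
print(round(total_mg_spared / 365, 1))  # 5.2 mg/day; the article quotes 5.3
```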
The rate of myasthenia gravis exacerbation or worsening did not differ significantly between the two groups, nor did the use of plasmapheresis or IVIG or the doses of azathioprine.
The overall number of serious adverse events did not differ significantly between the two groups (slow tapering, 22% vs. rapid-tapering, 36%; P = .15).
The researchers said it is possible that prednisone tapering would differ with another immunosuppressive agent, but because azathioprine is the first-line immunosuppressant usually recommended, these results are relevant for a large proportion of patients.
They said the better outcome of the intervention group could have been related to one or more of four differences in prednisone administration: an immediate high dose versus a slow increase of the prednisone dose; daily versus alternate-day dosing; earlier tapering initiation; and faster tapering. However, the study design did not allow identification of which of these factors was responsible.
“Researching the best prednisone-tapering scheme is not only a major issue for patients with myasthenia gravis but also for other autoimmune or inflammatory diseases, because validated prednisone-tapering regimens are scarce,” the authors said.
The rapid tapering of prednisone therapy appears to be feasible, beneficial, and safe in patients with generalized myasthenia gravis and “warrants testing in other autoimmune diseases,” they added.
Particularly relevant to late-onset disease
Commenting on the study, Raffi Topakian, MD, of Klinikum Wels-Grieskirchen, Wels, Austria, said the results showed that, in patients with moderate to severe generalized myasthenia gravis requiring high-dose prednisone, azathioprine, a widely used immunosuppressant, may have a quicker steroid-sparing effect than previously thought. Rapid steroid tapering can be achieved safely, he said, resulting in a reduction of the cumulative steroid dose over a year despite higher initial doses.
Dr. Topakian, who was not involved with the research, pointed out that the median age was advanced (around 56 years), and the benefit of a regimen that leads to a reduction of the cumulative steroid dose over a year may be disproportionately larger for older, sicker patients with many comorbidities who are at considerably higher risk for a prednisone-induced increase in cardiovascular complications, osteoporotic fractures, and gastrointestinal bleeding.
“The study findings are particularly relevant for the management of late-onset myasthenia gravis (when first symptoms start after age 45-50 years), which is being encountered more frequently over the past years,” he said.
“But the holy grail of myasthenia gravis treatment has not been found yet,” Dr. Topakian noted. “Disappointingly, rapid tapering of steroids (compared to slow tapering) resulted in a reduction of the cumulative steroid dose only, but was not associated with better myasthenia gravis functional status or lower doses of steroids at 15 months. To my view, this finding points to the limited immunosuppressive efficacy of azathioprine.”
He added that the study findings should not be extrapolated to patients with mild presentations or to those with muscle-specific kinase myasthenia gravis.
Dr. Sharshar disclosed no relevant financial relationships. Disclosures for the study coauthors appear in the original article.
A version of this article first appeared on Medscape.com.
FROM JAMA NEUROLOGY
Alien cells may explain COVID-19 brain fog
Large bone marrow cells called megakaryocytes have been found in the cortical capillaries of patients who died with COVID-19, a new report suggests.
The authors report five separate post-mortem cases from patients who died with COVID-19 in which large cells resembling megakaryocytes were identified in cortical capillaries. Immunohistochemistry subsequently confirmed their megakaryocyte identity.
They point out that the finding is of interest as – to their knowledge – megakaryocytes have not been found in the brain before.
The observations are described in a research letter published online Feb. 12 in JAMA Neurology.
Bone marrow cells in the brain
Lead author David Nauen, MD, PhD, a neuropathologist from Johns Hopkins University, Baltimore, reported that he identified these cells in the first analysis of post-mortem brain tissue from a patient who had COVID-19.
“Some other viruses cause changes in the brain such as encephalopathy, and as neurologic symptoms are often reported in COVID-19, I was curious to see if similar effects were seen in brain post-mortem samples from patients who had died with the infection,” Dr. Nauen said.
On his first analysis of the brain tissue of a patient who had COVID-19, Dr. Nauen saw no evidence of viral encephalitis, but he observed some “unusually large” cells in the brain capillaries.
“I was taken aback; I couldn’t figure out what they were. Then I realized these cells were megakaryocytes from the bone marrow. I have never seen these cells in the brain before. I asked several colleagues and none of them had either. After extensive literature searches, I could find no evidence of megakaryocytes being in the brain,” Dr. Nauen noted.
Megakaryocytes, he explained, are “very large cells, and the brain capillaries are very small – just large enough to let red blood cells and lymphocytes pass through. To see these very large cells in such vessels is extremely unusual. It looks like they are causing occlusions.”
By occluding flow through individual capillaries, these large cells could cause ischemic alteration in a distinct pattern, potentially resulting in an atypical form of neurologic impairment, the authors suggest.
“This might alter the hemodynamics and put pressure on other vessels, possibly contributing to the increased risk of stroke that has been reported in COVID-19,” Dr. Nauen said. None of the samples he examined came from patients with COVID-19 who had had a stroke, he reported.
Other than the presence of megakaryocytes in the capillaries, the brain looked normal, he said. He has now examined samples from 15 brains of patients who had COVID-19, and megakaryocytes were found in the brain capillaries in five of the cases.
New neurologic complication
Classic encephalitis found with other viruses has not been reported in brain post-mortem examinations from patients who had COVID-19, Dr. Nauen noted. “The cognitive issues such as grogginess associated with COVID-19 would indicate problems with the cortex but that hasn’t been documented. This occlusion of a multitude of tiny vessels by megakaryocytes may offer some explanation of the cognitive issues. This is a new kind of vascular insult seen on pathology, and suggests a new kind of neurologic complication,” he added.
The big question is what these megakaryocytes are doing in the brain.
“Megakaryocytes are bone marrow cells. They are not immune cells. Their job is to produce platelets to help the blood clot. They are not normally found outside the bone marrow, but they have been reported in other organs in COVID-19 patients.
“But the big puzzle associated with finding them in the brain is how they get through the very fine network of blood vessels in the lungs. The geometry just doesn’t work. We don’t know which part of the COVID inflammatory response makes this happen,” said Dr. Nauen.
The authors suggest one possibility is that altered endothelial or other signaling is recruiting megakaryocytes into the circulation and somehow permitting them to pass through the lungs.
“We need to try and understand if there is anything distinctive about these megakaryocytes – which proteins are they expressing that may explain why they are behaving in such an unusual way,” said Dr. Nauen.
Noting that many patients with severe COVID-19 have problems with clotting, and megakaryocytes are part of the clotting system, he speculated that some sort of aberrant message is being sent to these cells.
“It is notable that we found megakaryocytes in cortical capillaries in 33% of cases examined. Because the standard brain autopsy sections taken sampled at random [are] only a minute portion of the cortical volume, finding these cells suggests the total burden could be considerable,” the authors wrote.
Dr. Nauen added that to his knowledge, this is the first report of such observations, and the next step is to look for similar findings in larger sample sizes.
A version of this article first appeared on Medscape.com.
FROM JAMA NEUROLOGY
The true measure of cluster headache
Patients with cluster headache face a double whammy: Physicians too often fail to recognize it, and their condition is among the most severe and debilitating of headache types. In fact, a recent survey study found that patients rate cluster headache pain as more severe than labor pain, pancreatitis, or kidney stones.
The study’s comparison of cluster headaches to other common painful experiences can help nonsufferers relate to the experience, said Larry Schor, PhD, a coauthor of the paper. “Headache is a terrible word. Bee stings sting, burns burn. [A cluster headache] doesn’t ache. It’s a piercing intensity like you just can’t believe,” said Dr. Schor, professor of psychology at the University of West Georgia, Carrollton, and a cluster headache patient since he first experienced an attack at the age of 21.
The study was published in the January 2021 issue of Headache.
Ranking cluster headaches as worse than experiences such as childbirth or kidney stones is “kind of eye opening, and helps to describe the experience in terms that more people can relate to. I think it helps to share the experience of cluster headache more broadly, because we’re in a situation where cluster headache remains underfunded, and we don’t have enough treatments for it. I think one way to overcome that is to spread awareness of what this problem is, and the impact it has on human life,” said Rashmi Halker Singh, MD, associate professor of neurology at the Mayo Clinic in Scottsdale, Ariz., and deputy editor of Headache. She was not involved in the study.
Dr. Schor called for physicians to consider cluster headache an emergency, because of the severity of pain and also the potential for suicidality. Treatments remain comparatively sparse, but high-flow oxygen can help some patients, and subcutaneous or intranasal triptans can treat acute pain. In 2019, the Food and Drug Administration approved galcanezumab (Eli Lilly) for the treatment of episodic cluster headache.
But cluster headaches are often misdiagnosed. For many patients, it takes more than a year or even as long as 5 years to get an accurate diagnosis, according to Dr. Schor. Women may be particularly vulnerable to misdiagnosis, because migraines are more common in women. It doesn’t help that many neurologists are taught that cluster headache is primarily a male disease. “Because that idea is so ingrained, I think a lot of women who have cluster headache are probably missed and told they have migraine instead. There are a lot of women who have cluster headache, and that gender difference might not be as big a difference as we were initially taught. We need to do a better job of recognizing cluster headache to better understand what the true prevalence is,” said Dr. Halker Singh.
She noted that patients with side-locked headache should be evaluated for cluster headache, and asked how long the pain lasts in the absence of medication. “Also ask about the presence of cranial autonomic symptoms, and if they occur in the context of headache pain, and if they are side-locked to the side of the headache. Those are important questions that can tease out cluster headache from other conditions,” said Dr. Halker Singh.
For the survey, the researchers asked 1,604 patients with cluster headache to rate their pain on a scale of 1 to 10. Cluster headache ranked highest, at 9.7, followed by labor pain (7.2), pancreatitis (7.0), and nephrolithiasis (6.9). Cluster headache pain was rated 10.0 by 72.1% of respondents. Compared with patients who reported less pain, those reporting maximal pain were more likely to have cranial autonomic features, including conjunctival injection or lacrimation (91% vs. 85%), eyelid edema (77% vs. 66%), forehead/facial sweating (60% vs. 49%), fullness in the ear (47% vs. 35%), and miosis or ptosis (85% vs. 75%). They also had more frequent attacks (4.0 vs. 3.5 per day), higher Hopelessness Depression Symptom Questionnaire scores (24.5 vs. 21.1), and reported less benefit from calcium channel blockers (2.2 vs. 2.5 on a 5-point Likert scale), and they were more often female (34% vs. 24%) (P < .001 for all).
The study received funding from Autonomic Technologies and Cluster Busters. Dr. Schor and Dr. Halker Singh had no relevant financial disclosures.
Patients with cluster headache face a double whammy: Physicians too often fail to recognize it, and their condition is among the most severe and debilitating among headache types. In fact,
The study’s comparison of cluster headaches to other common painful experiences can help nonsufferers relate to the experience, said Larry Schor, PhD, a coauthor of the paper. “Headache is a terrible word. Bee stings sting, burns burn. [A cluster headache] doesn’t ache. It’s a piercing intensity like you just can’t believe,” said Dr. Schor, professor of psychology at the University of West Georgia, Carrollton, and a cluster headache patient since he first experienced an attack at the age of 21.
The study was published in the January 2021 issue of Headache.
Ranking cluster headaches as worse than experiences such as childbirth or kidney stones is “kind of eye opening, and helps to describe the experience in terms that more people can relate to. I think it helps to share the experience of cluster headache more broadly, because we’re in a situation where cluster headache remains underfunded, and we don’t have enough treatments for it. I think one way to overcome that is to spread awareness of what this problem is, and the impact it has on human life,” said Rashmi Halker Singh, MD, associate professor of neurology at the Mayo Clinic in Scottsdale, Ariz., and deputy editor of Headache. She was not involved in the study.
Dr. Schor called for physicians to consider cluster headache an emergency, because of the severity of pain and also the potential for suicidality. Treatments remain comparatively sparse, but high-flow oxygen can help some patients, and intranasal or intravenous triptans can treat acute pain. In 2019, the Food and Drug Administration approved galcanezumab (Eli Lilly) for the treatment of episodic cluster headache.
But cluster headaches are often misdiagnosed. For many patients, it takes more than a year or even as long as 5 years to get an accurate diagnosis, according to Dr. Schor. Women may be particularly vulnerable to misdiagnosis, because migraines are more common in women. It doesn’t help that many neurologists are taught that cluster headache is primarily a male disease. “Because that idea is so ingrained, I think a lot of women who have cluster headache are probably missed and told they have migraine instead. There are a lot of women who have cluster headache, and that gender difference might not be as big a difference as we were initially taught. We need to do a better job of recognizing cluster headache to better understand what the true prevalence is,” said Dr. Halker Singh.
She noted that patients with side-locked headache should be evaluated for cluster headache, and asked how long the pain lasts in the absence of medication. “Also ask about the presence of cranial autonomic symptoms, and if they occur in the context of headache pain, and if they are side-locked to the side of the headache. Those are important questions that can tease out cluster headache from other conditions,” said Dr. Halker Singh.
For the survey, the researchers asked 1,604 patients with cluster headache to rate their pain on a scale of 1 to 10. Cluster headache ranked highest at 9.7, followed by labor pain (7.2), pancreatitis (7.0), and nephrolithiasis (6.9). Cluster headache pain was rated at 10.0 by 72.1% of respondents. Compared with patients who reported less pain, those reporting maximal pain were more likely to have cranial autonomic features, including conjunctival injection or lacrimation (91% versus 85%), eyelid edema (77% versus 66%), forehead/facial sweating (60% versus 49%), fullness in the ear (47% versus 35%), and miosis or ptosis (85% versus 75%). They also had more frequent attacks (4.0 versus 3.5 per day), higher Hopelessness Depression Symptom Questionnaire scores (24.5 versus 21.1), and reported calcium channel blockers as less effective (2.2 versus 2.5 on a 5-point Likert scale), and they were more often female (34% versus 24%; P < .001 for all).
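To see why group differences of this size reach P < .001 in a sample of 1,604, a standard two-proportion z-test can be sketched in plain Python. The group sizes below are assumptions (the article reports percentages but not the exact split between maximal-pain and lower-pain respondents); the conjunctival injection/lacrimation comparison (91% versus 85%) is used as the example.

```python
from math import erf, sqrt

def two_proportion_z(p1, n1, p2, n2):
    """Two-sided z-test for a difference between two proportions."""
    x1, x2 = p1 * n1, p2 * n2          # implied event counts
    p_pool = (x1 + x2) / (n1 + n2)     # pooled proportion under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value via the standard normal CDF, Phi(z) = (1 + erf(z/sqrt(2)))/2
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Illustrative split of the 1,604 respondents into two roughly equal groups;
# the true group sizes were not reported in this article.
p = two_proportion_z(0.91, 800, 0.85, 804)
print(f"p = {p:.2e}")
```

Even with this rough assumed split, a 6-percentage-point gap in a sample this large yields a p-value well below .001, consistent with the reported result.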
The study received funding from Autonomic Technologies and Cluster Busters. Dr. Schor and Dr. Halker Singh had no relevant financial disclosures.
FROM HEADACHE
Prostate drugs tied to lower risk for Parkinson’s disease
Treatment of benign prostatic hyperplasia (BPH) with terazosin (Hytrin), doxazosin (Cardura), or alfuzosin (Uroxatral), all of which enhance glycolysis, was associated with a lower risk of developing Parkinson’s disease than treatment with tamsulosin (Flomax), a drug used for the same indication that does not affect glycolysis, new research suggests.
“If giving someone terazosin or similar medications truly reduces their risk of disease, these results could have significant clinical implications for neurologists,” said lead author Jacob E. Simmering, PhD, assistant professor of internal medicine at the University of Iowa, Iowa City.
There are few reliable neuroprotective treatments for Parkinson’s disease, he said. “We can manage some of the symptoms, but we can’t stop it from progressing. If a randomized trial finds the same result, this will provide a new option to slow progression of Parkinson’s disease.”
The pathogenesis of Parkinson’s disease is heterogeneous, however, and not all patients may benefit from glycolysis-enhancing drugs, the investigators noted. Future research will be needed to identify potential candidates for this treatment, and clarify the effects of these drugs, they wrote.
The findings were published online Feb. 1, 2021, in JAMA Neurology.
Time-dependent effects
The major risk factor for Parkinson’s disease is age, which is associated with impaired energy metabolism. Glycolysis is decreased among patients with Parkinson’s disease, yet impaired energy metabolism has not been investigated widely as a pathogenic factor in the disease, the authors wrote.
Studies have indicated that terazosin increases the activity of an enzyme important in glycolysis. Doxazosin and alfuzosin have a similar mechanism of action and enhance energy metabolism. Tamsulosin, a structurally unrelated drug, has the same mechanism of action as the other three drugs, but does not enhance energy metabolism.
In this report, the researchers investigated the hypothesis that patients who received therapy with terazosin, doxazosin, or alfuzosin would have a lower risk of developing Parkinson’s disease than patients receiving tamsulosin. To do that, they used health care utilization data from Denmark and the United States, including the Danish National Prescription Registry, the Danish National Patient Registry, the Danish Civil Registration System, and the Truven Health Analytics MarketScan database.
The investigators searched the records for patients who filled prescriptions for any of the four drugs of interest. They excluded any patients who developed Parkinson’s disease within 1 year of starting medication. Because use of these drugs is rare among women, they included only men in their analysis.
They looked at patient outcomes beginning at 1 year after the initiation of treatment. They also required patients to fill at least two prescriptions before the beginning of follow-up. Patients who switched from tamsulosin to any of the other drugs, or vice versa, were excluded from analysis.
The investigators used propensity-score matching to ensure that patients in the tamsulosin and terazosin/doxazosin/alfuzosin groups were similar in terms of their other potential risk factors. The primary outcome was the development of Parkinson’s disease.
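The matching step described above can be illustrated with a minimal sketch of greedy 1:1 nearest-neighbor matching on a propensity score. The scores and caliper below are hypothetical; the study's actual propensity models were built from the patients' baseline covariates and are not reproduced here.

```python
def greedy_match(treated, control, caliper=0.05):
    """Pair each treated-group score with the closest unused control-group
    score within the caliper; patients left unmatched are dropped."""
    pairs = []
    available = sorted(control)
    for t in sorted(treated):
        if not available:
            break
        best = min(available, key=lambda c: abs(c - t))
        if abs(best - t) <= caliper:
            pairs.append((t, best))
            available.remove(best)   # each control is used at most once
    return pairs

# Hypothetical propensity scores (estimated probability of treatment)
tamsulosin = [0.31, 0.44, 0.52, 0.67, 0.80]
glycolytic = [0.30, 0.33, 0.45, 0.50, 0.70, 0.81, 0.90]
pairs = greedy_match(tamsulosin, glycolytic)
print(pairs)
```

The point of the caliper is that patients with no sufficiently similar counterpart in the other group are excluded rather than forced into a poor match, which is how matching balances covariates between cohorts.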
They identified 52,365 propensity score–matched pairs in the Danish registries and 94,883 pairs in the Truven database. The mean age was 67.9 years in the Danish registries and 63.8 years in the Truven database, and follow-up was approximately 5 years and 3 years, respectively. Baseline covariates were well balanced between cohorts.
Among Danish patients, those who took terazosin, doxazosin, or alfuzosin had a lower risk of developing Parkinson’s disease versus those who took tamsulosin (hazard ratio, 0.88). Similarly, patients in the Truven database who took terazosin, doxazosin, or alfuzosin had a lower risk of developing Parkinson’s disease than those who took tamsulosin (HR, 0.63).
In both cohorts, the risk for Parkinson’s disease among patients receiving terazosin, doxazosin, or alfuzosin, compared with those receiving tamsulosin, decreased with increasing numbers of prescriptions filled. Long-term treatment with any of the three glycolysis-enhancing drugs was associated with greater risk reduction in the Danish (HR, 0.79) and Truven (HR, 0.46) cohorts versus tamsulosin.
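For intuition about what a hazard ratio like 0.88 conveys, a crude incidence-rate ratio (events per person-year in one group divided by the same quantity in the other) is a simpler stdlib analogue. The counts below are invented purely to reproduce the direction and rough magnitude of the Danish estimate; the published hazard ratios came from Cox proportional-hazards models, not this calculation.

```python
def rate_ratio(events_a, pyears_a, events_b, pyears_b):
    """Crude incidence-rate ratio: (events / person-years) in group A
    divided by (events / person-years) in group B."""
    return (events_a / pyears_a) / (events_b / pyears_b)

# Hypothetical counts, equal person-time in both groups for simplicity
rr = rate_ratio(events_a=440, pyears_a=250_000,   # glycolysis-enhancing drugs
                events_b=500, pyears_b=250_000)   # tamsulosin
print(round(rr, 2))
```

A ratio below 1.0 indicates fewer Parkinson's diagnoses per unit of follow-up time in the glycolysis-enhancing group; Cox models additionally adjust for when events occur and for censoring, which a crude ratio ignores.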
Differences in case definitions, which may reflect how Parkinson’s disease was managed, complicate comparisons between the Danish and Truven cohorts, said Dr. Simmering. Another challenge is the source of the data. “The Truven data set was derived from insurance claims from people with private insurance or Medicare supplemental plans,” he said. “This group is quite large but may not be representative of everyone in the United States. We would also only be able to follow people while they were on one insurance plan. If they switched coverage to a company that doesn’t contribute data, we would lose them.”
The Danish database, however, includes all residents of Denmark. Only people who left the country were lost to follow-up.
The results support the hypothesis that increasing energy in cells slows disease progression, Dr. Simmering added. “There are a few conditions, mostly REM sleep disorders, that are associated with future diagnosis of Parkinson’s disease. Right now, we don’t have anything to offer people at elevated risk of Parkinson’s disease that might prevent the disease. If a controlled trial finds that terazosin slows or prevents Parkinson’s disease, we would have something truly protective to offer these patients.”
Biomarker needed
Commenting on the results, Alberto J. Espay, MD, MSc, professor of neurology at the University of Cincinnati Academic Health Center, was cautious. “These findings are of unclear applicability to any particular patient without a biomarker for a deficit of glycolysis that these drugs are presumed to affect,” Dr. Espay said. “Hence, there is no feasible or warranted change in practice as a result of this study.”
Pathogenic mechanisms are heterogeneous among patients with Parkinson’s disease, Dr. Espay added. “We will need to understand who among the large biological universe of Parkinson’s patients may have impaired energy metabolism as a pathogenic mechanism to be selected for a future clinical trial evaluating terazosin, doxazosin, or alfuzosin as a potential disease-modifying intervention.”
Parkinson’s disease is not one disease, but a group of disorders with unique biological abnormalities, said Dr. Espay. “We know so much about ‘Parkinson’s disease’ and next to nothing about the biology of individuals with Parkinson’s disease.”
This situation has enabled the development of symptomatic treatments, such as dopaminergic therapies, but failed to yield disease-modifying treatments, he said.
The University of Iowa contributed funds for this study. Dr. Simmering has received pilot funding from the University of Iowa Institute for Clinical and Translational Science. He had no conflicts of interest to disclose. Dr. Espay disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM JAMA NEUROLOGY