
Ultrasound ablation for Parkinson’s disease: Benefit limited by adverse effects


Focused ultrasound for ablation of the subthalamic nucleus in one hemisphere improved motor features in a selected group of patients with markedly asymmetric Parkinson’s disease, but was associated with a high rate of adverse events, including dyskinesias and other neurologic complications, in a new randomized, sham-controlled trial.

“Longer-term and larger trials are needed to determine the role of focused ultrasound subthalamotomy in the management of Parkinson’s disease and its effect as compared with other available treatments, including deep-brain stimulation,” the authors concluded.

The trial was published online Dec. 24, 2020, in the New England Journal of Medicine.

An accompanying editorial concluded that the high rate of adverse events and the lack of ability to modulate treatment over time to treat prominent tremor “raise questions about the appropriate implementation of focused ultrasound–produced lesions for the treatment of Parkinson’s disease.”
 

A scalpel-free alternative to brain surgery

The study authors, led by Raul Martinez-Fernandez, MD, PhD, University Hospital HM Puerta del Sur, Mostoles, Spain, explained that, in severe cases of refractory motor manifestations such as tremor and motor complications, a neurosurgical approach using deep-brain stimulation of the subthalamic nucleus can be used. But to avoid craniotomy and electrode penetration, MRI-guided focused ultrasound for the ablation of deep-brain structures, including the subthalamic nucleus, is being investigated as a treatment for Parkinson’s disease.

Patients are potential candidates for ultrasound ablation if they have prominently asymmetric parkinsonism, if they are not considered to be clinically suitable candidates for surgery because of contraindications, or if they are reluctant to undergo a brain operation or to have an implanted device.

The current trial involved 40 patients with markedly asymmetric Parkinson’s disease who had motor signs not fully controlled by medication or who were ineligible for deep-brain stimulation surgery. They were randomly assigned in a 2:1 ratio to undergo focused ultrasound subthalamotomy on the side opposite their main motor signs or a sham procedure.

Results showed that the mean Movement Disorder Society–Unified Parkinson’s Disease Rating Scale part III (MDS-UPDRS III) motor score for the more affected side – which was the primary endpoint – decreased from 19.9 at baseline to 9.9 at 4 months in the active-treatment group (least-squares mean difference, 9.8 points); and from 18.7 to 17.1 in the control group (least-squares mean difference, 1.7 points). The between-group difference was 8.1 points (P < .001).

The change from baseline in the MDS-UPDRS III score for the more affected side in patients who underwent active treatment varied, ranging from 5% to 95%; the changes were qualitatively more evident for reduction of tremor and rigidity than for bradykinesia.

Adverse events in the active-treatment group were the following:

  • Dyskinesia in the off-medication state in six patients and in the on-medication state in six, which persisted in three and one, respectively, at 4 months.
  • Weakness on the treated side in five patients, which persisted in two at 4 months.
  • Speech disturbance in 15 patients, which persisted in three at 4 months.
  • Facial weakness in three patients, which persisted in one at 4 months.
  • Gait disturbance in 13 patients, which persisted in two at 4 months.

In six patients in the active-treatment group, some of these deficits were present at 12 months.

The researchers noted that one approach suggested to reduce the risk of dyskinesias is to extend ablations dorsal to the subthalamic nucleus in order to interrupt the pallidothalamic-projecting neurons.

The study also showed a greater reduction in the use of dopaminergic medication in the active-treatment group versus the control group, but the researchers noted that the 95% confidence intervals for this and other secondary outcomes were not adjusted for multiple comparisons, so no definite conclusions can be drawn from these data.

They also pointed out that subthalamotomy was performed in one hemisphere, and the natural evolution of Parkinson’s disease eventually leads to motor impairment on both sides of the body in most patients.

“The likely need for an increase in the daily dose of levodopa equivalent to maintain function on the untreated side of the body could lead to the development of dyskinesias on the treated side. However, the few open-label studies of long-term (≥36 months) follow-up of radiofrequency subthalamotomy performed in one hemisphere do not provide support for this concern,” they said.
 

An important step, but improvements are needed

In an accompanying editorial, Joel S. Perlmutter, MD, and Mwiza Ushe, MD, Washington University, St. Louis, noted that surgical deep-brain stimulation of the left and right subthalamic nuclei has shown a reduction in the severity of motor signs of 40%-60% and a reduction in medication use of up to 50%. But this technique involves a small craniotomy with implantation of stimulating electrodes, which has a 1%-5% risk of major adverse events such as hemorrhage, stroke, or infection.

Less severe complications include dystonia, dysarthria, gait impairment, dyskinesia, swallowing dysfunction, or change in verbal fluency; however, modification of the device programming may alleviate these effects. Nevertheless, some patients are wary of the implantation surgery and hardware and therefore decline to undergo deep-brain stimulation, the editorialists explained.

“The development of alternative procedures to deep-brain stimulation is important to the field of Parkinson’s disease treatment. The current trial begins the path to that goal, and improvements in targeting may improve the risk-benefit ratio and permit the use of lesions in both hemispheres, which would widen the population of eligible patients,” Dr. Perlmutter and Dr. Ushe wrote.

They pointed out that limiting the treatment to one side of the brain by ultrasound-produced lesioning constrains the application, since most patients with Parkinson’s disease have progression of symptoms on both sides of the body.

“The potential advantages and limitations of focused ultrasound–produced lesioning should be discussed with patients. We hope that improved technique will reduce the associated risks and increase the applicability of this provocative procedure,” the editorialists concluded.

This study was supported by Insightec, the Focused Ultrasound Foundation, Fundacion MAPFRE, Fundacion Hospitales de Madrid, and the University of Virginia Center of Excellence. Dr. Martinez-Fernandez reported receiving consultancy fees from Insightec. Dr. Ushe reported nonfinancial support from Abbott outside the submitted work. Dr. Perlmutter disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews- 29(2)

Article Source

FROM THE NEW ENGLAND JOURNAL OF MEDICINE

Publish date: January 7, 2021

Which imaging criteria identify progressive forms of MS?


The role of imaging in diagnosing progressive multiple sclerosis (MS) and in assessing prognosis is the subject of a new review.

MRI is central in the diagnostic work-up of patients suspected of having MS, given its high sensitivity in detecting disease dissemination in space and over time and its notable ability to exclude mimics of MS, the authors noted. However, diagnosis of primary progressive MS remains challenging and is only possible retrospectively on the basis of clinical assessment.

Identification of imaging features associated with primary progressive MS and features that predict evolution from relapsing remitting MS to secondary progressive MS is an important, unmet need, they wrote.

Diagnosis of progressive MS is limited by difficulties in distinguishing accumulating disability caused by inflammatory disease activity from that attributable to degenerative processes associated with secondary progressive MS. Moreover, there are no accepted clinical criteria for diagnosing secondary progressive MS, the authors explained.

This need has promoted extensive research in the field of imaging, facilitated by definition of novel MRI sequences, to identify imaging features reflecting pathophysiological mechanisms relevant to the pathobiology of progressive MS, the authors said.

The current review reports the conclusions of a workshop held in Milan in November 2019, at which an expert panel of neurologists and neuroradiologists addressed the role of MRI in progressive MS.

Massimo Filippi, MD, IRCCS San Raffaele Scientific Institute, Milan, was the lead author of the review, which was published online Dec. 14, 2020, in JAMA Neurology.

The authors concluded that no definitive, qualitative clinical, immunologic, histopathologic, or neuroimaging features differentiate primary progressive and secondary progressive forms of MS; both are characterized by neurodegenerative phenomena and a gradual and irreversible accumulation of clinical disability, which is also affected by aging and comorbidities.

A definitive diagnosis of primary progressive MS is more difficult than a diagnosis of relapsing remitting MS; in part, primary progressive MS is a diagnosis of exclusion because it can be mimicked by other conditions clinically and radiologically, the authors noted.

The authors did report that, although nonspecific, some spinal cord imaging features are typical of primary progressive MS. These include diffuse abnormalities and lesions involving gray matter and two or more white-matter columns, but confirmation of this is required.

In patients with primary progressive MS and those with relapse-onset MS, MRI features at disease onset predict long-term disability and a progressive disease course. These features include lesions in critical central nervous system regions (i.e., spinal cord, infratentorial regions, and gray matter) and high inflammatory activity in the first years after disease onset. These measures are evaluable in clinical practice, the authors said.

In patients with established MS, gray-matter involvement and neurodegeneration are associated with accelerated clinical worsening; however, detection, validation, and standardization need to be implemented at the individual patient level, they commented.

Novel candidate imaging biomarkers, such as subpial demyelination, and the presence of slowly expanding lesions or paramagnetic rim lesions may identify progressive MS but should be further investigated, they added.

Discovery of MRI markers capable of detecting evolution from relapsing-remitting to secondary progressive MS remains an unmet need that will probably require multiparametric MRI studies, because it is unlikely that a single MRI method will be able to allow clinicians to optimally distinguish among these stages, the authors said.

The contribution of these promising MRI measures combined with other biomarkers, such as quantification of serum neurofilament light chain levels or optical coherence tomography assessment, should be explored to improve the identification of patients with progressive MS, they concluded.
‘A comprehensive review’

In a comment, Jeffrey A. Cohen, MD, director of the Cleveland Clinic’s Mellen Center for MS Treatment and Research, said the article is a comprehensive review of the pathologic mechanisms that underlie progression in MS and the proxy measures of those processes (brain and spinal cord MRI, PET, optical coherence tomography, and biomarkers).

“The paper reports there is no qualitative difference between relapsing remitting and progressive MS; rather, the difference is quantitative,” Dr. Cohen noted. “In other words, the processes that underlie progression are present from the earliest stages of MS, becoming more prominent over time.”

The apparent transition to progressive MS, he added, “rather than representing a ‘transition,’ instead results from the accumulation of pathology over time, a shift from focal lesions to diffuse inflammation and damage, and unmasking of the damage due to decreased resiliency due to aging and failure of compensatory mechanisms (neuroplasticity and remyelination).”

Also commenting, Edward Fox, MD, director, MS Clinic of Central Texas and clinical associate professor, University of Texas, Austin, explained that loss of tissue is the main driver of progressive MS.

“We all look at imaging to confirm that the progressive symptoms expressed by the patient are related to demyelinating disease,” he said. “When I see MRI of the spinal cord showing multifocal lesions, especially if localized atrophy is seen in a region of the cord, I expect to hear a history of progressive deficits in gait and other signs of disability.”

Dr. Fox noted that, on MRI of the brain, gray matter atrophy both cortically and in the deep gray structures usually manifests as cognitive slowing and poorer performance in work and social situations.

“We hope that other biomarkers, such as neurofilament light chain, will add to this body of knowledge and give us a better grasp of the definition of neurodegeneration to confirm the clinical and radiographic findings,” he added.

Dr. Filippi has received compensation for consulting services and/or speaking activities from Bayer, Biogen Idec, Merck Serono, Novartis, Roche, Sanofi, Genzyme, Takeda, and Teva Pharmaceutical Industries; and research support from ARiSLA, Biogen Idec, Fondazione Italiana Sclerosi Multipla, Italian Ministry of Health, Merck Serono, Novartis, Roche, and Teva.

A version of this article first appeared on Medscape.com.



“We hope that other biomarkers, such as neurofilament light chain, will add to this body of knowledge and give us a better grasp of the definition of neurodegeneration to confirm the clinical and radiographic findings,” he added.

Dr. Filippi has received compensation for consulting services and/or speaking activities from Bayer, Biogen Idec, Merck Serono, Novartis, Roche, Sanofi, Genzyme, Takeda, and Teva Pharmaceutical Industries; and research support from ARiSLA, Biogen Idec, Fondazione Italiana Sclerosi Multipla, Italian Ministry of Health, Merck Serono, Novartis, Roche, and Teva.

A version of this article first appeared on Medscape.com.

The role of imaging in diagnosing progressive multiple sclerosis (MS) and in assessing prognosis is the subject of a new review.

MRI is central in the diagnostic work-up of patients suspected of having MS, given its high sensitivity in detecting disease dissemination in space and over time and its notable ability to exclude mimics of MS, the authors noted. However, diagnosis of primary progressive MS remains challenging and is only possible retrospectively on the basis of clinical assessment.

Identification of imaging features associated with primary progressive MS, and of features that predict evolution from relapsing-remitting MS to secondary progressive MS, is an important unmet need, they wrote.

Diagnosis of progressive MS is limited by difficulties in distinguishing accumulating disability caused by inflammatory disease activity from that attributable to degenerative processes associated with secondary progressive MS. Moreover, there are no accepted clinical criteria for diagnosing secondary progressive MS, the authors explained.

This need has prompted extensive imaging research, facilitated by the development of novel MRI sequences, to identify imaging features that reflect pathophysiologic mechanisms relevant to progressive MS, the authors said.

The current review reports the conclusions of a workshop held in Milan in November 2019, at which an expert panel of neurologists and neuroradiologists addressed the role of MRI in progressive MS.

Massimo Filippi, MD, IRCCS San Raffaele Scientific Institute, Milan, was the lead author of the review, which was published online Dec. 14, 2020, in JAMA Neurology.

The authors concluded that no definitive, qualitative clinical, immunologic, histopathologic, or neuroimaging features differentiate primary progressive and secondary progressive forms of MS; both are characterized by neurodegenerative phenomena and a gradual and irreversible accumulation of clinical disability, which is also affected by aging and comorbidities.

A definitive diagnosis of primary progressive MS is more difficult than a diagnosis of relapsing-remitting MS; in part, primary progressive MS is a diagnosis of exclusion because it can be mimicked clinically and radiologically by other conditions, the authors noted.

The authors did report that some spinal cord imaging features, although nonspecific, are typical of primary progressive MS. These include diffuse abnormalities and lesions involving gray matter and two or more white-matter columns, though these findings require confirmation.

In patients with primary progressive MS and those with relapse-onset MS, MRI features at disease onset predict long-term disability and a progressive disease course. These features include lesions in critical central nervous system regions (i.e., spinal cord, infratentorial regions, and gray matter) and high inflammatory activity in the first years after disease onset. These measures are evaluable in clinical practice, the authors said.

In patients with established MS, gray-matter involvement and neurodegeneration are associated with accelerated clinical worsening; however, their detection, validation, and standardization need to be implemented at the individual patient level, they commented.

Novel candidate imaging biomarkers, such as subpial demyelination, and the presence of slowly expanding lesions or paramagnetic rim lesions may identify progressive MS but should be further investigated, they added.

Discovery of MRI markers capable of detecting evolution from relapsing-remitting to secondary progressive MS remains an unmet need that will probably require multiparametric MRI studies, because a single MRI method is unlikely to allow clinicians to distinguish optimally among these stages, the authors said.

The contribution of these promising MRI measures combined with other biomarkers, such as quantification of serum neurofilament light chain levels or optical coherence tomography assessment, should be explored to improve the identification of patients with progressive MS, they concluded.

‘A comprehensive review’

In a comment, Jeffrey A. Cohen, MD, director of the Cleveland Clinic’s Mellen Center for MS Treatment and Research, said the article is a comprehensive review of the pathologic mechanisms that underlie progression in MS and the proxy measures of those processes (brain and spinal cord MRI, PET, optical coherence tomography, and biomarkers).

“The paper reports there is no qualitative difference between relapsing remitting and progressive MS; rather, the difference is quantitative,” Dr. Cohen noted. “In other words, the processes that underlie progression are present from the earliest stages of MS, becoming more prominent over time.”

The apparent transition to progressive MS, he added, “rather than representing a ‘transition,’ instead results from the accumulation of pathology over time, a shift from focal lesions to diffuse inflammation and damage, and unmasking of the damage due to decreased resiliency due to aging and failure of compensatory mechanisms (neuroplasticity and remyelination).”

Also commenting, Edward Fox, MD, director, MS Clinic of Central Texas and clinical associate professor, University of Texas, Austin, explained that loss of tissue is the main driver of progressive MS.

“We all look at imaging to confirm that the progressive symptoms expressed by the patient are related to demyelinating disease,” he said. “When I see MRI of the spinal cord showing multifocal lesions, especially if localized atrophy is seen in a region of the cord, I expect to hear a history of progressive deficits in gait and other signs of disability.”

Dr. Fox noted that, on MRI of the brain, gray matter atrophy both cortically and in the deep gray structures usually manifests as cognitive slowing and poorer performance in work and social situations.

“We hope that other biomarkers, such as neurofilament light chain, will add to this body of knowledge and give us a better grasp of the definition of neurodegeneration to confirm the clinical and radiographic findings,” he added.

Dr. Filippi has received compensation for consulting services and/or speaking activities from Bayer, Biogen Idec, Merck Serono, Novartis, Roche, Sanofi, Genzyme, Takeda, and Teva Pharmaceutical Industries; and research support from ARiSLA, Biogen Idec, Fondazione Italiana Sclerosi Multipla, Italian Ministry of Health, Merck Serono, Novartis, Roche, and Teva.

A version of this article first appeared on Medscape.com.

Issue
Neurology Reviews- 29(2)
Publish date: January 5, 2021

New evidence shows that COVID-19 invades the brain


SARS-CoV-2 can invade the brain and directly act on brain cells, causing neuroinflammation, new animal research suggests. Investigators injected spike 1 (S1), which is found on the tufts of the “red spikes” of the virus, into mice and found that it crossed the blood-brain barrier (BBB) and was taken up not only by brain regions and the brain space but also by other organs – specifically, the lungs, spleen, liver, and kidneys.

“We found that the S1 protein, which is the protein COVID-19 uses to ‘grab onto’ cells, crosses the BBB and is a good model of what the virus does when it enters the brain,” lead author William A. Banks, MD, professor of medicine, University of Washington, Seattle, said in an interview.

“When proteins such as the S1 protein become detached from the virus, they can enter the brain and cause mayhem, causing the brain to release cytokines, which, in turn, cause inflammation and subsequent neurotoxicity,” said Dr. Banks, associate chief of staff and a researcher at the Puget Sound Veterans Affairs Healthcare System.

The study was published online in Nature Neuroscience.
 

Neurologic symptoms

COVID-19 is associated with a variety of central nervous system symptoms, including the loss of taste and smell, headaches, confusion, stroke, and cerebral hemorrhage, the investigators noted.

Dr. Banks explained that SARS-CoV-2 may enter the brain by crossing the BBB, acting directly on the brain centers responsible for other body functions. The respiratory symptoms of COVID-19 may therefore result partly from the invasion of the areas of the brain responsible for respiratory functions, not only from the virus’ action at the site of the lungs.

The researchers set out to assess whether a particular viral protein – S1, a subunit of the viral spike protein – could cross the BBB or enter other organs when injected into mice. They found that intravenously injected S1 (I-S1) was cleared from the blood and taken up by tissues in multiple organs, including the lung, spleen, kidney, and liver.

Notably, uptake of I-S1 was higher in the liver, “suggesting that this protein is cleared from the blood predominantly by the liver,” Dr. Banks said. In addition, uptake by the lungs is “important, because that’s where many of the effects of the virus are,” he added.

The researchers found that I-S1 in the brains of the mice was “mostly degraded” 30 minutes following injection. “This indicates that I-S1 enters the BBB intact but is eventually degraded in the brain,” they wrote.

Moreover, by 30 minutes, more than half of the I-S1 proteins had crossed the capillary wall and had fully entered into the brain parenchymal and interstitial fluid spaces, as well as other regions.
 

More severe outcomes in men

The researchers then induced an inflammatory state in the mice through injection of lipopolysaccharide (LPS) and found that inflammation increased I-S1 uptake in both the brain and the lung (where uptake was increased by 101%). “These results show that inflammation could increase S1 toxicity for lung tissue by increasing its uptake,” the authors suggested. Moreover, inflammation also increased the entry of I-S1 into the brain, “likely due to BBB disruption.”

In human beings, male sex and APOE4 genotype are risk factors for both contracting COVID-19 and having a poor outcome, the authors noted. As a result, they examined I-S1 uptake in male and female mice that expressed human APOE3 or APOE4 (induced by a mouse ApoE promoter).

Multiple-comparison tests showed that among male mice that expressed human APOE3, the “fastest I-S1 uptake” was in the olfactory bulb, liver, and kidney. Female mice that expressed APOE3 displayed increased I-S1 uptake in the spleen.

“This observation might relate to the increased susceptibility of men to more severe COVID-19 outcomes,” coauthor Jacob Raber, PhD, professor, departments of behavioral neuroscience, neurology, and radiation medicine, Oregon Health & Science University, Portland, said in a press release.

In addition to intravenous I-S1 injection, the researchers also investigated the effects of intranasal administration. They found that, although it also entered the brain, it did so at levels roughly 10 times lower than those induced by intravenous administration.
 

“Frightening tricks”

Dr. Banks said his laboratory has studied the BBB in conditions such as Alzheimer’s disease, obesity, diabetes, and HIV. “Our experience with viruses is that they do an incredible number of things and have a frightening number of tricks,” he said. In this case, “the virus is probably causing inflammation by releasing cytokines elsewhere in the body that get into the brain through the BBB.” Conversely, “the virus itself may enter the brain by crossing the BBB and directly cause brain cells to release their own cytokines,” he added.

An additional finding of the study is that whatever the S1 protein does in the brain is a model for what the entire virus does, because these proteins often bring the virus along with them, he added.

Dr. Banks said the clinical implication of the findings is that antibodies from people who have already had COVID-19 could potentially be directed against S1. Similarly, he added, so could antibodies generated by COVID-19 vaccines, which induce production of S1.

“When an antibody locks onto something, it prevents it from crossing the BBB,” Dr. Banks noted.
 

Confirmatory findings

Commenting on the study, Howard E. Gendelman, MD, Margaret R. Larson Professor of Internal Medicine and Infectious Diseases and professor and chair of the department of pharmacology and experimental neuroscience, University of Nebraska, Omaha, said the study is confirmatory.

“What this paper highlights, and we have known for a long time, is that COVID-19 is a systemic, not only a respiratory, disease involving many organs and tissues and can yield not only pulmonary problems but also a whole host of cardiac, brain, and kidney problems,” he said.

“So the fact that these proteins are getting in [the brain] and are able to induce a reaction in the brain itself, and this is part of the complex progressive nature of COVID-19, is an important finding,” added Dr. Gendelman, director of the center for neurodegenerative disorders at the university. He was not involved with the study.

The study was supported by the Veterans Affairs Puget Sound Healthcare System and by grants from the National Institutes of Health. The authors and Dr. Gendelman have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

FROM NATURE NEUROSCIENCE

Publish date: January 4, 2021

High blood pressure at any age speeds cognitive decline

Individuals who have hypertension at any age are more likely to experience more rapid cognitive decline compared with their counterparts with normal blood pressure, new research shows. In a retrospective study of more than 15,000 participants, hypertension during middle age was associated with memory decline, and onset at later ages was linked to worsening memory and global cognition.

The investigators found that prehypertension, defined as systolic pressure of 120-139 mm Hg or diastolic pressure of 80-89 mm Hg, was also linked to accelerated cognitive decline.

Although duration of hypertension was not associated with any marker of cognitive decline, blood pressure control “can substantially reduce hypertension’s deleterious effect on the pace of cognitive decline,” said study investigator Sandhi M. Barreto, MD, PhD, professor of medicine at Universidade Federal de Minas Gerais, Belo Horizonte, Brazil.

The findings were published online Dec. 14, 2020, in Hypertension.
 

Unanswered questions

Hypertension is an established and highly prevalent risk factor for cognitive decline, but the age at which it begins to affect cognition is unclear. Previous research suggests that onset during middle age is associated with more harmful cognitive effects than onset in later life. One reason for this apparent difference may be that the duration of hypertension influences the magnitude of cognitive decline, the researchers noted.

Other studies have shown that prehypertension is associated with damage to certain organs, but its effects on cognition are uncertain. In addition, whether good blood pressure control with antihypertensive medications lessens the impact of hypertension on cognition is also unclear.

To investigate, the researchers examined data from the ongoing, multicenter ELSA-Brasil study. ELSA-Brasil follows 15,105 civil servants between the ages of 35 and 74 years. Dr. Barreto and team assessed data from visit 1, which was conducted between 2008 and 2010, and visit 2, which was conducted between 2012 and 2014.

At each visit, participants underwent a memory test, a verbal fluency test, and the Trail Making Test Part B. The investigators calculated Z scores for these tests to derive a global cognitive score.
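The composite score described above can be sketched as follows. The raw scores, the sign-flipping of the timed Trail Making test, and the equal weighting of the three tests are illustrative assumptions, not the study's exact procedure.

```python
import statistics

def z_scores(values):
    """Standardize raw test scores to Z scores (mean 0, sample SD 1)."""
    mu = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mu) / sd for v in values]

# Hypothetical raw scores for 4 participants on the three cognitive tests
memory  = [10, 12, 14, 16]
fluency = [30, 25, 35, 40]
trails  = [-60, -75, -50, -45]  # Trail Making time, sign-flipped so higher = better

# Global cognitive score: mean of the per-test Z scores (equal weighting assumed)
per_test = [z_scores(t) for t in (memory, fluency, trails)]
global_score = [statistics.mean(p[i] for p in per_test) for i in range(4)]
print([round(g, 2) for g in global_score])
```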

Blood pressure was measured on the right arm, and hypertension status, age at the time of hypertension diagnosis, duration of hypertension diagnosis, hypertension treatment, and control status were recorded. Other covariables included sex, education, race, smoking status, physical activity, body mass index, and total cholesterol level.

The researchers excluded patients who did not undergo cognitive testing at visit 2, those who had a history of stroke at baseline, and those who initiated antihypertensive medications despite having normotension. After exclusions, the analysis included 7,063 participants (approximately 55% were women, 15% were Black).

At visit 1, the mean age of the group was 58.9 years, and 53.4% of participants had 14 or more years of education. In addition, 22% had prehypertension, and 46.8% had hypertension. The median duration of hypertension was 7 years; 29.8% of participants with hypertension were diagnosed with the condition during middle age.

Of those who reported having hypertension at visit 1, 7.3% were not taking any antihypertensive medication. Among participants with hypertension who were taking antihypertensives, 31.2% had uncontrolled blood pressure.
 

Independent predictor

Results showed that prehypertension independently predicted a significantly greater decline in verbal fluency (Z score, –0.0095; P < .01) and global cognitive score (Z score, –0.0049; P < .05) compared with normal blood pressure.

At middle age, hypertension was associated with a steeper decline in memory (Z score, –0.0072; P < .05) compared with normal blood pressure. At older ages, hypertension was linked to a steeper decline in both memory (Z score, –0.0151; P < .001) and global cognitive score (Z score, –0.0080; P < .01). Duration of hypertension, however, did not significantly predict changes in cognition (P = .109).

Among those with hypertension who were taking antihypertensive medications, those with uncontrolled blood pressure experienced greater declines in memory (Z score, –0.0126; P < .01) and global cognitive score (Z score, –0.0074; P < .01) than did those with controlled blood pressure.

The study participants had a comparatively high level of education, which has been shown to “boost cognitive reserve and lessen the speed of age-related cognitive decline,” Dr. Barreto noted. However, “our results indicate that the effect of hypertension on cognitive decline affects individuals of all educational levels similarly,” she said.

Dr. Barreto noted that the findings have two major clinical implications. First, “maintaining blood pressure below prehypertension levels is important to preserve cognitive function or delay cognitive decline,” she said. Second, “in hypertensive individuals, keeping blood pressure under control is essential to reduce the speed of cognitive decline.”

The researchers plan to conduct further analyses of the data to clarify the observed relationship between memory and verbal fluency. They also plan to examine how hypertension affects long-term executive function.
 

‘Continuum of risk’

Commenting on the study, Philip B. Gorelick, MD, MPH, adjunct professor of neurology (stroke and neurocritical care) at Northwestern University, Chicago, noted that research to date suggests the stroke risk associated with blood pressure is best understood as a continuum rather than as a set of discrete cutoff points.

“The same may hold true for cognitive decline and dementia. There may be a continuum of risk whereby persons even at so-called elevated but relatively lower levels of blood pressure based on a continuous scale are at risk,” said Dr. Gorelick, who was not involved with the current study.

The investigators relied on a large and well-studied population of civil servants. However, the population’s relative youth and high level of education may limit the generalizability of the findings, he noted. In addition, the follow-up time was relatively short.

“The hard endpoint of dementia was not studied but would be of interest to enhance our understanding of the influence of blood pressure elevation on cognitive decline or dementia during a longer follow-up of the cohort,” Dr. Gorelick said.

The findings also suggest the need to better understand mechanisms that link blood pressure elevation with cognitive decline, he added.

They indicate “the need for additional clinical trials to better elucidate blood pressure lowering targets for cognitive preservation in different groups of persons at risk,” such as those with normal cognition, those with mild cognitive impairment, and those with dementia, said Dr. Gorelick. “For example, is it safe and efficacious to lower blood pressure in persons with more advanced cognitive impairment or dementia?” he asked.

The study was funded by the Brazilian Coordination for the Improvement of Higher Education Personnel. Dr. Barreto has received support from the Research Agency of the State of Minas Gerais. Although Dr. Gorelick was not involved in the ELSA-Brasil cohort study, he serves on a data monitoring committee for a trial of a blood pressure–lowering agent in the preservation of cognition.

A version of this article first appeared on Medscape.com.

Issue: Neurology Reviews 29(2)

FROM HYPERTENSION

Publish date: December 17, 2020

Air pollution linked to brain amyloid pathology

Higher levels of air pollution were associated with an increased risk for amyloid-beta pathology in a new study of older adults with cognitive impairment. “Many studies have now found a link between air pollution and clinical outcomes of dementia or cognitive decline,” said lead author Leonardo Iaccarino, PhD, Weill Institute for Neurosciences, University of California, San Francisco. “But this study is now showing a clear link between air pollution and a biomarker of Alzheimer’s disease: It shows a relationship between bad air quality and pathology in the brain.

“We believe that exposure to air pollution should be considered as one factor in the lifetime risk of developing Alzheimer’s disease,” he added. “We believe it is a significant determinant. Our results suggest that, if we can reduce occupational and residential exposure to air pollution, then this could help reduce the risk of Alzheimer’s disease.”

The study was published online Nov. 30 in JAMA Neurology.
 

A modifiable risk factor

Dr. Iaccarino explained that it is well known that air pollution is linked to poor health outcomes. “As well as cardiovascular and respiratory disease, there is also growing interest in the relationship between air pollution and brain health,” he said. “The link is becoming more and more convincing, with evidence from laboratory, animal, and human studies suggesting that individuals exposed to poor air quality have an increased risk of cognitive decline and dementia.”

In addition, this year, the Lancet Commission included air pollution in its updated list of modifiable risk factors for dementia.

For the current study, the researchers analyzed data from the Imaging Dementia–Evidence for Amyloid Scanning (IDEAS) Study, which included more than 18,000 U.S. participants with cognitive impairment who received an amyloid positron-emission tomography scan between 2016 and 2018.

The investigators used data from the IDEAS study to assess the relationship between the air quality at the place of residence of each patient and the likelihood of a positive amyloid PET result. Public records from the U.S. Environmental Protection Agency were used to estimate air quality in individual ZIP-code areas during two periods – 2002-2003 (approximately 14 years before the amyloid PET scan) and 2015-2016 (approximately 1 year before the amyloid PET scan).

Results showed that those living in an area with increased air pollution, as determined using predicted concentrations of fine particulate matter (PM2.5), had a higher probability of a positive amyloid PET scan. This association was dose dependent and statistically significant after adjusting for demographic, lifestyle, and socioeconomic factors as well as medical comorbidities. The association was seen in both periods; the adjusted odds ratio was 1.10 in 2002-2003 and 1.15 in 2015-2016.

“This shows about a 10% increased probability of a positive amyloid test for individuals living in the worst polluted areas, compared with those in the least polluted areas,” Dr. Iaccarino explained.

Every unit increase in PM2.5 in 2002-2003 was associated with an increased probability of positive amyloid findings on PET of 0.5%. Every unit increase in PM2.5 for the 2015-2016 period was associated with an increased probability of positive amyloid findings on PET of 0.8%.
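To put adjusted odds ratios like the 1.10 and 1.15 reported above into absolute terms, an odds ratio can be applied to a baseline rate. The 50% baseline amyloid-positivity rate used below is purely hypothetical, chosen only to illustrate the arithmetic.

```python
def apply_odds_ratio(baseline_prob: float, odds_ratio: float) -> float:
    """Return the probability implied by multiplying the baseline odds by an odds ratio."""
    odds = baseline_prob / (1 - baseline_prob)
    new_odds = odds * odds_ratio
    return new_odds / (1 + new_odds)

baseline = 0.50  # hypothetical amyloid-PET positivity rate, not from the study
for period, or_value in [("2002-2003", 1.10), ("2015-2016", 1.15)]:
    print(period, round(apply_odds_ratio(baseline, or_value), 3))
# prints:
# 2002-2003 0.524
# 2015-2016 0.535
```

Note that an odds ratio only approximates a relative change in probability when the outcome is rare, which is one reason headline figures like "about 10% increased probability" should be read as rough summaries.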

“This was a very large cohort study, and we adjusted for multiple other factors, so these are pretty robust findings,” Dr. Iaccarino said.

Exposure to higher ozone concentrations was not associated with amyloid positivity on PET scans in either time window.

“These findings suggest that brain amyloid-beta accumulation could be one of the biological pathways in the increased incidence of dementia and cognitive decline associated with exposure to air pollution,” the researchers stated.
 

A public health concern

“Adverse effects of airborne toxic pollutants associated with amyloid-beta pathology should be considered in public health policy decisions and should inform individual lifetime risk of developing Alzheimer’s disease and dementia,” they concluded.

Dr. Iaccarino noted that, although governments need to take primary action in reducing air pollution, individuals can make some changes to reduce their exposure to poor-quality air.

“Such changes could include not going out or using masks when pollution levels are very high (as happened recently in California with the wildfires) and avoiding areas where the air quality is known to be bad. In addition, there are activities which increase indoor air pollution which can be changed, such as certain types of cooking, cigarette smoking, and the use of coal fires,” he commented.

“Based on our findings, it would be reasonable to take action on these things, especially for individuals at higher risk of cardiovascular and respiratory disease or Alzheimer’s,” he added.

On a more optimistic note, Dr. Iaccarino pointed out that air quality in the United States has improved significantly in recent years. Meaningful improvements were found between the two periods in this analysis (2002-2016), “so we are going in the right direction.”

The IDEAS Study was funded by the Alzheimer’s Association, the American College of Radiology, Avid Radiopharmaceuticals, GE Healthcare, and Life Molecular Imaging. Dr. Iaccarino has disclosed no relevant financial relationships.

A version of this article originally appeared on Medscape.com.

Issue: Neurology Reviews 29(1)

Higher levels of air pollution were associated with an increased risk for amyloid-beta pathology in a new study of older adults with cognitive impairment. “Many studies have now found a link between air pollution and clinical outcomes of dementia or cognitive decline,” said lead author Leonardo Iaccarino, PhD, Weill Institute for Neurosciences, University of California, San Francisco. “But this study is now showing a clear link between air pollution and a biomarker of Alzheimer’s disease: It shows a relationship between bad air quality and pathology in the brain.

“We believe that exposure to air pollution should be considered as one factor in the lifetime risk of developing Alzheimer’s disease,” he added. “We believe it is a significant determinant. Our results suggest that, if we can reduce occupational and residential exposure to air pollution, then this could help reduce the risk of Alzheimer’s disease.”

The study was published online Nov. 30 in JAMA Neurology.
 

A modifiable risk factor

Dr. Iaccarino explained that it is well known that air pollution is linked to poor health outcomes. “As well as cardiovascular and respiratory disease, there is also growing interest in the relationship between air pollution and brain health,” he said. “The link is becoming more and more convincing, with evidence from laboratory, animal, and human studies suggesting that individuals exposed to poor air quality have an increased risk of cognitive decline and dementia.”

In addition, this year, the Lancet Commission included air pollution in its updated list of modifiable risk factors for dementia.

For the current study, the researchers analyzed data from the Imaging Dementia–Evidence for Amyloid Scanning (IDEAS) Study, which included more than 18,000 U.S. participants with cognitive impairment who received an amyloid positron-emission tomography scan between 2016 and 2018.

The investigators used data from the IDEAS study to assess the relationship between the air quality at the place of residence of each patient and the likelihood of a positive amyloid PET result. Public records from the U.S. Environmental Protection Agency were used to estimate air quality in individual ZIP-code areas during two periods – 2002-2003 (approximately 14 years before the amyloid PET scan) and 2015-2016 (approximately 1 year before the amyloid PET scan).

Results showed that those living in an area with increased air pollution, as determined using concentrations of predicted fine particulate matter (PM2.5), had a higher probability of a positive amyloid PET scan. This association was dose dependent and statistically significant after adjusting for demographic, lifestyle, and socioeconomic factors as well as medical comorbidities. The association was seen in both periods; the adjusted odds ratio was 1.10 in 2002-2003 and 1.15 in 2015-2016.

“This shows about a 10% increased probability of a positive amyloid test for individuals living in the worst polluted areas, compared with those in the least polluted areas,” Dr. Iaccarino explained.

Every unit increase in PM2.5 in 2002-2003 was associated with an increased probability of positive amyloid findings on PET of 0.5%. Every unit increase in PM2.5 in for the 2015-2016 period was associated with an increased probability of positive amyloid findings on PET of 0.8%.


Higher levels of air pollution were associated with an increased risk for amyloid-beta pathology in a new study of older adults with cognitive impairment. “Many studies have now found a link between air pollution and clinical outcomes of dementia or cognitive decline,” said lead author Leonardo Iaccarino, PhD, Weill Institute for Neurosciences, University of California, San Francisco. “But this study is now showing a clear link between air pollution and a biomarker of Alzheimer’s disease: It shows a relationship between bad air quality and pathology in the brain.

“We believe that exposure to air pollution should be considered as one factor in the lifetime risk of developing Alzheimer’s disease,” he added. “We believe it is a significant determinant. Our results suggest that, if we can reduce occupational and residential exposure to air pollution, then this could help reduce the risk of Alzheimer’s disease.”

The study was published online Nov. 30 in JAMA Neurology.

A modifiable risk factor

Dr. Iaccarino explained that it is well known that air pollution is linked to poor health outcomes. “As well as cardiovascular and respiratory disease, there is also growing interest in the relationship between air pollution and brain health,” he said. “The link is becoming more and more convincing, with evidence from laboratory, animal, and human studies suggesting that individuals exposed to poor air quality have an increased risk of cognitive decline and dementia.”

In addition, this year, the Lancet Commission included air pollution in its updated list of modifiable risk factors for dementia.

For the current study, the researchers analyzed data from the Imaging Dementia–Evidence for Amyloid Scanning (IDEAS) Study, which included more than 18,000 U.S. participants with cognitive impairment who received an amyloid positron-emission tomography scan between 2016 and 2018.

The investigators used data from the IDEAS study to assess the relationship between the air quality at the place of residence of each patient and the likelihood of a positive amyloid PET result. Public records from the U.S. Environmental Protection Agency were used to estimate air quality in individual ZIP-code areas during two periods – 2002-2003 (approximately 14 years before the amyloid PET scan) and 2015-2016 (approximately 1 year before the amyloid PET scan).

Results showed that those living in an area with increased air pollution, as determined using concentrations of predicted fine particulate matter (PM2.5), had a higher probability of a positive amyloid PET scan. This association was dose dependent and statistically significant after adjusting for demographic, lifestyle, and socioeconomic factors as well as medical comorbidities. The association was seen in both periods; the adjusted odds ratio was 1.10 in 2002-2003 and 1.15 in 2015-2016.

“This shows about a 10% increased probability of a positive amyloid test for individuals living in the worst polluted areas, compared with those in the least polluted areas,” Dr. Iaccarino explained.

Every unit increase in PM2.5 in 2002-2003 was associated with a 0.5% increase in the probability of positive amyloid findings on PET; every unit increase in PM2.5 in the 2015-2016 period was associated with a 0.8% increase.

“This was a very large cohort study, and we adjusted for multiple other factors, so these are pretty robust findings,” Dr. Iaccarino said.

Exposure to higher ozone concentrations was not associated with amyloid positivity on PET scans in either time window.

“These findings suggest that brain amyloid-beta accumulation could be one of the biological pathways in the increased incidence of dementia and cognitive decline associated with exposure to air pollution,” the researchers stated.

A public health concern

“Adverse effects of airborne toxic pollutants associated with amyloid-beta pathology should be considered in public health policy decisions and should inform individual lifetime risk of developing Alzheimer’s disease and dementia,” they concluded.

Dr. Iaccarino noted that, although governments need to take primary action in reducing air pollution, individuals can make some changes to reduce their exposure to poor-quality air.

“Such changes could include not going out or using masks when pollution levels are very high (as happened recently in California with the wildfires) and avoiding areas where the air quality is known to be bad. In addition, there are activities which increase indoor air pollution which can be changed, such as certain types of cooking, cigarette smoking, use of coal fires,” he commented.

“Based on our findings, it would be reasonable to take action on these things, especially for individuals at higher risk of cardiovascular and respiratory disease or Alzheimer’s,” he added.

On a more optimistic note, Dr. Iaccarino pointed out that air quality in the United States has improved significantly in recent years. Meaningful improvements were found between the two periods in this analysis (2002-2016), “so we are going in the right direction.”

The IDEAS Study was funded by the Alzheimer’s Association, the American College of Radiology, Avid Radiopharmaceuticals, GE Healthcare, and Life Molecular Imaging. Dr. Iaccarino has disclosed no relevant financial relationships.

A version of this article originally appeared on Medscape.com.

Issue
Neurology Reviews- 29(1)

FROM JAMA NEUROLOGY

Publish date: December 8, 2020

Oral steroids benefit patients with cluster headache


Adjunctive oral prednisone appears to significantly reduce cluster headache attacks, new research shows. Results of the multicenter, randomized, double-blind trial show that patients who received the steroid had 25% fewer attacks in the first week of therapy, compared with their counterparts who received placebo.

In addition, more than a third of patients in the prednisone group were pain free, and for almost half, headache frequency was reduced by at least 50% at day 7 of treatment.

These findings provide clear evidence that prednisone, in conjunction with the use of verapamil, is effective in cluster headache, said lead author Mark Obermann, MD, director, Center for Neurology, Asklepios Hospitals Seesen (Germany), and associate professor, University of Duisburg-Essen (Germany).

The key message, he added, is that all patients with cluster headache should receive prednisone at the start of an episode.

The study was published online Nov. 24 in the Lancet Neurology.

‘Suicide headaches’

Cluster headaches are intense unilateral attacks of facial and head pain. They last 15-180 minutes and predominantly affect men. They are accompanied by trigeminal autonomic symptoms and are extremely painful. “They’re referred to as ‘suicide headaches’ because the pain is so severe that patients often report they think about killing themselves to get rid of the pain,” said Dr. Obermann.

The cause is unclear, although there is some evidence that the hypothalamus is involved. The headaches sometimes follow a “strict circadian pattern,” said Dr. Obermann. He noted that the attacks might occur over a few weeks or months and then not return for months or even years.

An estimated 1 in 1,000 people experience cluster headache, but the condition is underrecognized, and research is scarce and poorly funded. Previous research does show that the calcium channel blocker verapamil, which is used to treat high blood pressure, is effective in cluster headache. However, it takes about 14 days to work and has to be slowly titrated because of cardiac side effects, said Dr. Obermann. For these reasons, international guidelines recommend initiating short-term preventive treatment with corticosteroids to suppress, or at least lessen, cluster headache attacks until long-term prevention is effective.

Although some clinicians treat cluster headaches with corticosteroids, others don’t because of a lack of evidence that shows they are effective. “There’s no evidence whatsoever on what the correct dose is or whether it helps at all. This is the gap we wanted to close,” said Dr. Obermann.

The study included 116 adult patients with cluster headache from 10 centers who were experiencing a cluster headache episode and were not taking prophylactic medication.

The trial only included patients who had an attack within 30 days of their current episode. The investigators included this restriction to reduce the possibility of spontaneous remission, which is “a big problem” in cluster headache trials, he said. To confirm that episodes were cluster headache attacks, patients were also required to have moderate to severe pain, indicated by a score of at least 5 on a numerical rating scale on which 0 indicates no pain and 10 indicates the worst imaginable pain.

Participants were allowed to use treatments for acute attacks, but these therapies were limited to triptans, high-flow oxygen, intranasal lidocaine, ergotamine, and oral analgesics.

Debilitating pain

Patients were randomly assigned to receive oral prednisone (n = 53) or placebo (n = 56). The study groups were matched with respect to demographic and clinical characteristics. Prednisone was initiated at 100 mg/d for 5 days and was then tapered by 20 mg every 3 days in the active-treatment group. All patients also received oral verapamil at a starting dose of 40 mg three times per day. The dose was increased every 3 days by 40 mg to a maximum of 360 mg/d.

All participants received pantoprazole 20 mg to prevent the gastric side effects of prednisone. An attack was defined as a unilateral headache of moderate to severe intensity. The study lasted 28 days.

The study’s primary outcome was the mean number of cluster headache attacks during the first week of treatment with prednisone versus placebo.

The mean number of attacks during the first week of treatment was 7.1 in the prednisone group and 9.5 in the placebo group, for a difference of –2.4 attacks (95% confidence interval, –4.8 to –0.03; P = .002). “This might not sound like much,” but reducing the number of daily attacks from, say, eight to six “really makes a difference because the attacks are so painful,” said Dr. Obermann.

The prednisone group also came out on top for a number of secondary outcomes. After the first 7 days, attacks ceased in 35% of the prednisone group versus 7% in the placebo group.

‘Clear evidence’ of efficacy

About 49% of patients who took prednisone reported a reduction of at least 50% in attack frequency at day 7. By comparison, 15% of patients who received placebo reported such a reduction. The number of cluster attacks at day 28 was lower in the prednisone group than in the placebo group.

With respect to treatment effect, the difference between prednisone and placebo gradually lessened over time “in parallel to the verapamil dose reaching its therapeutic effect,” the investigators noted. “Therefore, attack frequency reduction slowly converged between groups,” they added.

The study results provide “clear evidence” and should reassure clinicians that short-term prednisone early in a cluster headache attack is effective, said Dr. Obermann.

Adverse events, which included headache, palpitations, dizziness, and nausea, were as expected and were similar in the two groups. There were only two severe adverse events, both of which occurred in participants in the placebo group.

Dr. Obermann said the investigators were surprised that so many patients in the study were taking analgesics. “Analgesics don’t work in cluster headache; they just don’t work in this kind of pain.”

He noted that prednisone exposure of study patients spanned only 19 days and amounted to a cumulative dose of 1,100 mg, which he believes is safe.

The prednisone dose used in the study is “what most clinicians use in clinical practice,” although there have been reports of success using 500 mg of IV prednisone over 5 days, said Dr. Obermann. He added that it would be “interesting to see if 50 mg would be just as good” as a starting dose.

Potential limitations of the study include the fact that the majority of participants were White, so the findings may not be generalizable to other populations.

Long-awaited results

In an accompanying editorial, Anne Ducros, MD, PhD, professor of neurology and director of the Headache Center, Montpellier (France) University Hospital, said the study provides “strong and long-awaited evidence supporting the use of oral steroids as a transitional treatment option.”

The trial “raises many topics for future research,” one of which is the long-term safety of prednisone for patients with cluster headache, said Dr. Ducros. She noted that use of high-dose steroids once or twice a year for 15 years or more “has the potential for severe systemic toxic effects,” such as corticosteroid-induced osteonecrosis of the femoral head.

Other questions about corticosteroid use for patients with cluster headache remain. These include understanding whether these agents provide better efficacy than occipital nerve injections and determining the optimal verapamil regimen, she noted.

In addition, the risk for oral steroid misuse needs to be studied, she said. She noted that drug misuse is common among patients with cluster headache.

Despite these questions, the results of this new study “provide an important step forward for patients with cluster headache, for whom safe and effective transitional therapies are much needed,” Dr. Ducros wrote.


Dr. Obermann has received fees from Sanofi, Biogen, Novartis, Teva Pharmaceuticals, and Eli Lilly and grants from Allergan and Heel Pharmaceuticals outside of this work. Dr. Ducros has received fees from Amgen, Novartis, Teva, and Eli Lilly; grants from the Programme Hospitalier de Recherche Clinique and from the Appel d’Offre Interne of Montpellier University Hospital; and nonfinancial support from SOS Oxygene.

A version of this article originally appeared on Medscape.com.

Issue
Neurology Reviews- 29(1)


Publish date: December 8, 2020

First guidelines for keto diets in adults with epilepsy released


An international panel of experts has published the first set of recommendations based on current clinical practices and scientific evidence for using ketogenic diet therapies in adults with drug-resistant epilepsy.

Just as in children with epilepsy, ketogenic diet therapies can be safe and effective in adults with epilepsy but should only be undertaken with the support of medical professionals trained in their use, the group said.

Dr. Mackenzie Cervenka


“Motivation is the key to successful ketogenic diet therapy adherence,” first author Mackenzie Cervenka, MD, director of the Adult Epilepsy Diet Center and associate professor of neurology at Johns Hopkins University, Baltimore, said in an interview.

“Patients who are autonomous require self-motivation and having a strong support structure is important as well. For those patients who are dependents, their caregivers need to be motivated to manage their diet,” said Dr. Cervenka.

The guidelines were published online Oct. 30 in Neurology Clinical Practice.

Novel in adult neurology

Ketogenic diet therapies are high-fat, low-carbohydrate, and adequate-protein diets that induce fat metabolism and ketone production. Despite their use as an effective antiseizure therapy since the 1920s, ketogenic diet therapies remain novel in adult neurology.

Furthermore, while there are established guidelines for ketogenic diet therapies to reduce seizures in children, there were no formal recommendations for adults, until now.

Drawing on the experience of experts at 20 centers using ketogenic diet therapies in more than 2,100 adults with epilepsy in 10 countries, Dr. Cervenka and an international team developed recommendations on use of ketogenic diet therapies in adults.

The panel noted, “with a relatively mild side effect profile and the potential to reduce seizures in nearly 60% of adults with drug-resistant epilepsy, ketogenic diet therapies should be part of the repertoire of available options.”

Ketogenic diet therapies are appropriate to offer to adults with seizure types and epilepsy syndromes for which these treatments are known to be effective in children, they said. These include tuberous sclerosis complex, Rett syndrome, Lennox-Gastaut syndrome, glucose transporter type 1 deficiency syndrome, genetic generalized epilepsies, and focal epilepsies caused by underlying migrational disorders and resistant to antiseizure medication.

However, adults with drug-resistant focal epilepsy should be offered surgical evaluation first, given the higher anticipated rate of seizure freedom via this route, the panel said.
 

A focus on compliance

Experts at nearly all of the centers report using two or more ketogenic diet therapies. Ninety percent use the modified Atkins diet, 84% use the classic ketogenic diet, and 63% use the modified ketogenic diet and/or low glycemic index treatment. More than half of the centers (58%) use medium-chain triglyceride oil in combination with another ketogenic diet therapy to boost ketone body production.

The most important factors influencing the choice of ketogenic diet therapy are ease of diet application for the patient (100%) and patient and/or caregiver preference, home setting, and mode of feeding (90% each).

The panel recommended that ketogenic diet therapies be tailored to fit the needs of the individual, taking into account his or her physical and mental characteristics, underlying medical conditions, food preferences, type and amount of support from family and others, level of self-sufficiency, feeding habits, and ease of following the diet.

“Most of the differences between the child and adult recommendations have to do with compliance. Often, it’s more of a challenge for adults than for children,” said Dr. Cervenka.

The panel recommended providing adult patients with recipe ideas, individualized training on the ketogenic diet lifestyle from a dietitian or nutritionist, and guidance for meal planning and preparation before starting the diet. This will provide the greatest likelihood of success, as patients often report difficulties coping with carbohydrate restriction.

“In pediatric practice, positive responders typically remain on a ketogenic diet therapy for 2 years before considering weaning. Ketogenic diet therapy in adults is not time-limited. However, a minimum of 3 months of ketogenic diet therapy is recommended before any judgment of response is made,” the panel advised.

The panel pointed out that the absolute metabolic contraindications and the cautions related to feeding difficulties, gastrointestinal dysfunction, and digestion remain the same for both children and adults. However, they added that a range of common adult conditions, such as hyperlipidemia, heart disease, diabetes, low bone density, and pregnancy, “bring additional consideration, caution, and monitoring to ketogenic diet therapy use.”
 

 

 

Beyond epilepsy

The guidelines also call for pre–ketogenic diet therapy biochemical studies to screen adults for preexisting abnormalities and establish a reference for comparing follow-up results after 3, 6, and 12 months, and then annually or as needed.

They also noted that metabolic studies such as urine organic acid and serum amino acid levels are generally not needed in adults unless there is a strong clinical suspicion for an underlying metabolic disorder.

Updated genetic evaluation may also be considered in adults with intellectual disability and epilepsy of unknown etiology. Serial bone mineral density scans may be obtained every 5 years.

The guidelines also call for ketone monitoring (blood beta-hydroxybutyrate or urine ketones) during the early months of ketogenic diet therapy as an objective indication of compliance and biochemical response.

Dietary adjustments should focus on optimizing the treatment response, minimizing side effects, and maximizing sustainability.

Adults on a ketogenic diet therapy should also be advised to take multivitamin and mineral supplements and drink plenty of fluids.

The panel said emerging evidence also supports the use of ketogenic diet therapies in other adult neurologic disorders such as migraine, Parkinson’s disease, dementia, and multiple sclerosis.

However, the panel said further evidence is needed to guide recommendations on use of ketogenic diet therapies in other neurologic conditions.

The research had no targeted funding. Dr. Cervenka has reported receiving grants from Nutricia, Vitaflo, BrightFocus Foundation, and Army Research Laboratory; honoraria from the American Epilepsy Society, the Neurology Center, Epigenix, LivaNova, and Nutricia; royalties from Demos; and consulting for Nutricia, Glut1 Deficiency Foundation, and Sage Therapeutics. Disclosures for the other authors are listed in the article.

A version of this article originally appeared on Medscape.com.


Issue
Neurology Reviews- 29(1)
Publish date: December 2, 2020

Statins beneficial in elderly, guidelines should be strengthened


Contrary to historical evidence, elevated LDL cholesterol levels increase the risk for heart attack and cardiovascular disease among older people, and older patients benefit as much as, if not more than, younger people from statins and other cholesterol-lowering drugs, two new studies show.

“By contrast with previous historical studies, our data show that LDL cholesterol is an important risk factor for myocardial infarction and atherosclerotic cardiovascular disease in a contemporary primary prevention cohort of individuals aged 70 to 100 years,” Borge Nordestgaard, MD, of the University of Copenhagen, and colleagues noted in the first of the two studies, published this week in the Lancet.

“By lowering LDL cholesterol in healthy individuals aged 70-100 years, the potential for preventing myocardial infarctions and atherosclerotic cardiovascular disease is huge, and at a substantially lower number needed to treat when compared with those aged 20-69 years,” they added.

“These findings support the concept of the cumulative burden of LDL cholesterol over one’s lifetime and the progressive increase in risk for atherosclerotic cardiovascular disease, including myocardial infarction, with age,” added Frederick J. Raal, PhD, and Farzahna Mohamed, MB BCh, of the University of the Witwatersrand, Johannesburg, South Africa, in an editorial published with both new studies in the Lancet (2020 Nov 10. doi: 10.1016/S0140-6736[20]32333-3).

The studies underscore the need for clinicians to consider continued risks associated with elevated LDL cholesterol in older age, they stressed, adding that statins are also beneficial for younger persons at risk to prevent conditions from worsening.

“The average age of patients in all the trials analyzed was older than 60 years, an age when atherosclerotic cardiovascular disease is already well established,” the editorialists wrote.

“Lipid-lowering therapy should be initiated at a younger age, preferably before age 40 years, in those at risk to delay the onset of atherosclerosis, rather than try to manage the condition once fully established or advanced,” they stressed.
 

Few RCTs have included patients older than 70

For persons aged 40-75 years, elevated LDL cholesterol levels are a known risk factor for MI and atherosclerotic cardiovascular disease, and there is consensus in guidelines regarding treatment with statins.

However, the risk for people older than 70 is controversial. Some studies show little or no association between elevated LDL cholesterol levels and an increased risk for MI.

Contributing to the uncertainty is that few of the randomized, controlled trials that have investigated the question have included patients aged older than 70 years.

As a consequence, many practice guidelines have noted that the level of evidence in older patients is low, and some organizations have lowered the strength of recommendations regarding the treatment for older patients in comparison with younger patients.
 

Primary prevention: CV events increase with elevated LDL cholesterol in older age

Dr. Nordestgaard and colleagues studied data on 91,131 people living in Copenhagen who did not have atherosclerotic cardiovascular disease or diabetes at baseline and were not taking statins.

Of the participants, 10,592 were aged 70-79 years, and 3,188 participants were aged 80-100 years.

Over an average follow-up period of 7.7 years, 1,515 participants had a first MI, and 3,389 developed atherosclerotic cardiovascular disease.

In the primary-prevention cohort, after multivariate adjustment, the risk of having a heart attack per 1.0 mmol/L increase in LDL cholesterol was increased in the group overall (hazard ratio, 1.34). The increased risk was observed for all age groups, including those aged 80-100 years (HR, 1.28), 70-79 (HR, 1.25), 60-69 (HR, 1.29), 50-59 (HR, 1.28), and 20-49 (HR, 1.68).

Risk for atherosclerotic cardiovascular disease was also raised per 1.0 mmol/L increase in LDL cholesterol overall (HR, 1.16) and in all age groups, particularly those aged 70-100 years.

Greater elevations in LDL cholesterol (5.0 mmol/L or higher, indicative of possible familial hypercholesterolemia) were associated with a notably higher risk for heart attack after multivariate adjustment in people aged 80-100 (HR, 2.99). Risk was also higher among those aged 70-79 (HR, 1.82).

The highest incidence was in those older than 70. The rate was 8.5 heart attacks per 1,000 people per year among those aged 80-100 and 5.2 heart attacks per 1,000 in those aged 70-79. The rates were 2.5 per 1,000 among those 60-69, 1.8 for those aged 50-59, and 0.8 for those aged 20-49.

“The absolute risk [of cardiovascular events] is of course much higher in the elderly than those under the age of 75, but what was a surprise was how clear our results were on a relative risk scale, that the risk associated with elevated LDL [cholesterol] was as high in people aged 80-100 as the younger patients,” Dr. Nordestgaard said in an interview.

With regard to the benefits of cholesterol-lowering drugs, the study showed that the number needed to treat to prevent one heart attack over 5 years was 80 among those aged 80-100; the number was 439 for people aged 50-59.

When moderate-intensity statins were used, the number needed to treat to prevent one cardiovascular disease event of any type dropped to 42 for patients aged 80-100. It was 88 for those aged 70-79, 164 for those aged 60-69, 345 for those aged 50-59, and 769 for those aged 20-49.

“The clinical significance of this is that it appears those in older age groups indeed benefit from cholesterol-lowering therapy,” Dr. Nordestgaard said. “I think many people have this idea that LDL [cholesterol] is not important over the age of about 70-75, but that’s not the case.”

“These robust findings are novel,” he and his colleagues stressed.

Despite these observational findings, the South African editorialists noted that “whether lipid-lowering therapy should be initiated for primary prevention in people aged 75 years or older is unclear,” owing to the host of risks and benefits that need to be balanced.

The findings of an ongoing randomized, placebo-controlled trial (STAREE) may answer this question, they wrote. It is investigating primary prevention in 18,000 older patients (≥70 years) who are being randomly assigned to receive atorvastatin 40 mg/d or placebo. The study is seeking to determine whether statin treatment extends the length of a disability-free life, which will be assessed on the basis of survival outside permanent residential care. Results are expected in 2022-2023.
 

 

 

Unequivocal reductions in events in elderly, comparable with younger patients

In the second study (Lancet. 2020 Nov 10. doi: 10.1016/S0140-6736[20]32332-1), Baris Gencer, MD, of Brigham and Women’s Hospital, Boston, and colleagues evaluated the effects of statins and other cholesterol-lowering drugs, including ezetimibe and proprotein convertase subtilisin/kexin type 9 inhibitors, in older versus younger patients.

The systematic review and meta-analysis of 29 randomized controlled trials, also published in the Lancet, was presented virtually as a poster at the 2020 American Heart Association scientific sessions. It included data on 244,090 patients, including 21,492 aged 75 years and older.

The meta-analysis included studies of cardiovascular outcomes of a guideline-recommended LDL cholesterol–lowering drug, with a median follow-up of at least 2 years and inclusion of data on patients aged 75 years and older.

The results showed that over a median follow-up of 2.2 to 6 years, statin use by older patients was associated with a relative risk reduction of major vascular events of 26% per 1 mmol/L reduction in LDL cholesterol (P = .0019), which was comparable with a risk reduction of 15% per 1 mmol/L reduction in LDL cholesterol for patients younger than 75 years (P = .37, compared with older patients).

Treatment of older patients with LDL cholesterol–lowering drugs was also associated with significantly improved outcomes in cardiovascular death (risk ratio, 0.85), MI (RR, 0.80), stroke (RR, 0.73), and coronary revascularization (RR, 0.80).

“We found an unequivocal reduction in the risk of major vascular events with both statin and nonstatin LDL cholesterol-lowering treatments, which was similar to that seen in younger patients,” the authors wrote.

“Cholesterol-lowering medications are affordable drugs that have reduced risk of heart disease for millions of people worldwide, but until now, their benefits for older people have remained less certain,” said lead author Marc Sabatine, MD, also of Brigham and Women’s Hospital, in a Lancet press release.

“Our analysis indicates that these therapies are as effective in reducing cardiovascular events and deaths in people aged 75 years and over as they are in younger people. We found no offsetting safety concerns, and together, these results should strengthen guideline recommendations for the use of cholesterol-lowering medications, including statin and nonstatin therapy, in elderly people.”

The editorialists agreed: “More than 80% of fatal cardiovascular events occur in individuals older than 65 years, and the incidence of cardiovascular events is increasing in those older than 80 years; therefore, the findings of Gencer and colleagues’ study should encourage the use of lipid-lowering therapy in older patients.”

The authors of the two studies have disclosed no relevant financial relationships. Dr. Raal has received research grants, honoraria, or consulting fees for advisory board membership, professional input, and lectures on lipid-lowering drug therapy from Amgen, Regeneron, Sanofi, Novartis, and the Medicines Company.

A version of this article originally appeared on Medscape.com.

Issue
Neurology Reviews- 29(2)
Publications
Topics
Sections



Publish date: November 24, 2020

Concussion linked to risk for dementia, Parkinson’s disease, and ADHD

Article Type
Changed
Thu, 12/15/2022 - 15:43

 

Concussion is associated with increased risk for subsequent development of attention-deficit/hyperactivity disorder (ADHD), as well as dementia and Parkinson’s disease, new research suggests. Results from a retrospective, population-based cohort study showed that controlling for socioeconomic status and overall health did not significantly affect this association.

The link between concussion and risk for ADHD and for mood and anxiety disorder was stronger in women than in men. In addition, a history of multiple concussions strengthened the association between concussion and subsequent mood and anxiety disorder, dementia, and Parkinson’s disease compared with a single concussion.

The findings are similar to those of previous studies, noted lead author Marc P. Morissette, PhD, research assistant at the Pan Am Clinic Foundation in Winnipeg, Manitoba, Canada. “The main methodological differences separating our study from previous studies in this area is a focus on concussion-specific injuries identified from medical records and the potential for study participants to have up to 25 years of follow-up data,” said Dr. Morissette.

The findings were published online July 27 in Family Medicine and Community Health, a BMJ journal.
 

Almost 190,000 participants

Several studies have shown associations between head injury and increased risk for ADHD, depression, anxiety, Alzheimer’s disease, and Parkinson’s disease. However, many of these studies relied on self-reported medical history, included all forms of traumatic brain injury, and failed to adjust for preexisting health conditions.

An improved understanding of concussion and the risks associated with it could help physicians manage their patients’ long-term needs, the investigators noted.

In the current study, the researchers examined anonymized administrative health data collected from 1990–1991 through 2014–2015 in the Manitoba Population Research Data Repository at the Manitoba Center for Health Policy.

Eligible patients had been diagnosed with concussion in accordance with standard criteria. Participants were excluded if they had been diagnosed with dementia or Parkinson’s disease before the incident concussion during the study period. The investigators matched three control participants to each included patient on the basis of age, sex, and location.

Study outcome was time from index date (date of first concussion) to diagnosis of ADHD, mood and anxiety disorder, dementia, or Parkinson’s disease. The researchers controlled for socioeconomic status using the Socioeconomic Factor Index, version 2 (SEFI2), and for preexisting medical conditions using the Charlson Comorbidity Index (CCI).

The study included 28,021 men (mean age, 25 years) and 19,462 women (mean age, 30 years) in the concussion group and 81,871 men (mean age, 25 years) and 57,159 women (mean age, 30 years) in the control group. Mean SEFI2 score was approximately −0.05, and mean CCI score was approximately 0.2.
 

Dose effect?

Results showed that concussion was associated with an increased risk for ADHD (hazard ratio [HR], 1.39), mood and anxiety disorder (HR, 1.72), dementia (HR, 1.72), and Parkinson’s disease (HR, 1.57).

After a concussion, the risk of developing ADHD was 28% higher and the risk of developing mood and anxiety disorder was 7% higher among women than among men. Gender was not associated with risk for dementia or Parkinson’s disease after concussion.

Sustaining a second concussion increased the strength of the association with risk for dementia compared with sustaining a single concussion (HR, 1.62). Similarly, sustaining more than three concussions increased the strength of the association with the risk for mood and anxiety disorders (HR for more than three vs one concussion, 1.22) and Parkinson’s disease (HR, 3.27).

A sensitivity analysis found similar associations between concussion and risk for mood and anxiety disorder among all age groups. Younger participants were at greater risk for ADHD, however, and older participants were at greater risk for dementia and Parkinson’s disease.

Increased awareness of concussion and the outcomes of interest, along with improved diagnostic tools, may have influenced the study’s findings, Dr. Morissette noted. “The sex-based differences may be due to either pathophysiological differences in response to concussive injuries or potentially a difference in willingness to seek medical care or share symptoms, concussion-related or otherwise, with a medical professional,” he said.

“We are hopeful that our findings will encourage practitioners to be cognizant of various conditions that may present in individuals who have previously experienced a concussion,” Dr. Morissette added. “If physicians are aware of the various associations identified following a concussion, it may lead to more thorough clinical examination at initial presentation, along with more dedicated care throughout the patient’s life.”
 

 

 

Association versus causation

Commenting on the research, Steven Erickson, MD, sports medicine specialist at Banner–University Medicine Neuroscience Institute, Phoenix, Ariz., noted that although the study showed an association between concussion and subsequent diagnosis of ADHD, anxiety, and Parkinson’s disease, “this association should not be misconstrued as causation.” He added that the study’s conclusions “are just as likely to be due to labeling theory” or a self-fulfilling prophecy.

“Patients diagnosed with ADHD, anxiety, or Parkinson’s disease may recall concussion and associate the two diagnoses; but patients who have not previously been diagnosed with a concussion cannot draw that conclusion,” said Dr. Erickson, who was not involved with the research.

Citing the apparent gender difference in the strength of the association between concussion and the outcomes of interest, Dr. Erickson noted that women are more likely to report symptoms in general “and therefore are more likely to be diagnosed with ADHD and anxiety disorders” because of differences in reporting rather than incidence of disease.

“Further research needs to be done to definitively determine a causal relationship between concussion and any psychiatric or neurologic diagnosis,” Dr. Erickson concluded.

The study was funded by the Pan Am Clinic Foundation. Dr. Morissette and Dr. Erickson have disclosed no relevant financial relationships.

A version of this article originally appeared on Medscape.com.

Issue
Neurology Reviews- 28(9)

 


 

Concussion is associated with an increased risk for subsequent development of attention-deficit/hyperactivity disorder (ADHD), as well as dementia and Parkinson’s disease, new research suggests. Results from a retrospective, population-based cohort study showed that controlling for socioeconomic status and overall health did not significantly affect this association.

The link between concussion and risk for ADHD and for mood and anxiety disorder was stronger in women than in men. In addition, a history of multiple concussions strengthened the association between concussion and subsequent mood and anxiety disorder, dementia, and Parkinson’s disease compared with a single concussion.

The findings are similar to those of previous studies, noted lead author Marc P. Morissette, PhD, research assistant at the Pan Am Clinic Foundation in Winnipeg, Manitoba, Canada. “The main methodological differences separating our study from previous studies in this area are the focus on concussion-specific injuries identified from medical records and the potential for study participants to have up to 25 years of follow-up data,” said Dr. Morissette.

The findings were published online July 27 in Family Medicine and Community Health, a BMJ journal.
 

Almost 190,000 participants

Several studies have shown associations between head injury and increased risk for ADHD, depression, anxiety, Alzheimer’s disease, and Parkinson’s disease. However, many of these studies relied on self-reported medical history, included all forms of traumatic brain injury, and failed to adjust for preexisting health conditions.

An improved understanding of concussion and the risks associated with it could help physicians manage their patients’ long-term needs, the investigators noted.

In the current study, the researchers examined anonymized administrative health data collected from 1990–1991 through 2014–2015 and held in the Manitoba Population Research Data Repository at the Manitoba Centre for Health Policy.

Eligible patients had been diagnosed with concussion in accordance with standard criteria. Participants were excluded if they had been diagnosed with dementia or Parkinson’s disease before the incident concussion during the study period. The investigators matched three control participants to each included patient on the basis of age, sex, and location.

The study outcome was time from the index date (date of first concussion) to diagnosis of ADHD, mood and anxiety disorder, dementia, or Parkinson’s disease. The researchers controlled for socioeconomic status using the Socioeconomic Factor Index, version 2 (SEFI2), and for preexisting medical conditions using the Charlson Comorbidity Index (CCI).

The study included 28,021 men (mean age, 25 years) and 19,462 women (mean age, 30 years) in the concussion group and 81,871 men (mean age, 25 years) and 57,159 women (mean age, 30 years) in the control group. Mean SEFI2 score was approximately −0.05, and mean CCI score was approximately 0.2.
 

Dose effect?

Results showed that concussion was associated with an increased risk for ADHD (hazard ratio [HR], 1.39), mood and anxiety disorder (HR, 1.72), dementia (HR, 1.72), and Parkinson’s disease (HR, 1.57).

After a concussion, the risk of developing ADHD was 28% higher and the risk of developing mood and anxiety disorder was 7% higher among women than among men. Gender was not associated with risk for dementia or Parkinson’s disease after concussion.

Sustaining a second concussion increased the strength of the association with risk for dementia compared with sustaining a single concussion (HR, 1.62). Similarly, sustaining more than three concussions increased the strength of the association with risk for mood and anxiety disorders (HR for more than three vs. one concussion, 1.22) and Parkinson’s disease (HR, 3.27).

A sensitivity analysis found similar associations between concussion and risk for mood and anxiety disorder among all age groups. Younger participants were at greater risk for ADHD, however, and older participants were at greater risk for dementia and Parkinson’s disease.

Increased awareness of concussion and the outcomes of interest, along with improved diagnostic tools, may have influenced the study’s findings, Dr. Morissette noted. “The sex-based differences may be due to either pathophysiological differences in response to concussive injuries or potentially a difference in willingness to seek medical care or share symptoms, concussion-related or otherwise, with a medical professional,” he said.

“We are hopeful that our findings will encourage practitioners to be cognizant of various conditions that may present in individuals who have previously experienced a concussion,” Dr. Morissette added. “If physicians are aware of the various associations identified following a concussion, it may lead to more thorough clinical examination at initial presentation, along with more dedicated care throughout the patient’s life.”
Association versus causation

Commenting on the research, Steven Erickson, MD, sports medicine specialist at Banner–University Medicine Neuroscience Institute, Phoenix, Ariz., noted that although the study showed an association between concussion and subsequent diagnosis of ADHD, anxiety, and Parkinson’s disease, “this association should not be misconstrued as causation.” He added that the study’s conclusions “are just as likely to be due to labeling theory” or a self-fulfilling prophecy.

“Patients diagnosed with ADHD, anxiety, or Parkinson’s disease may recall concussion and associate the two diagnoses; but patients who have not previously been diagnosed with a concussion cannot draw that conclusion,” said Dr. Erickson, who was not involved with the research.

Citing the apparent gender difference in the strength of the association between concussion and the outcomes of interest, Dr. Erickson noted that women are more likely to report symptoms in general “and therefore are more likely to be diagnosed with ADHD and anxiety disorders” because of differences in reporting rather than incidence of disease.

“Further research needs to be done to definitively determine a causal relationship between concussion and any psychiatric or neurologic diagnosis,” Dr. Erickson concluded.

The study was funded by the Pan Am Clinic Foundation. Dr. Morissette and Dr. Erickson have disclosed no relevant financial relationships.

A version of this article originally appeared on Medscape.com.

Issue
Neurology Reviews- 28(9)
Article Source
From Family Medicine and Community Health
Publish date: August 12, 2020

Late-onset epilepsy tied to a threefold increased dementia risk

Article Type
Changed
Thu, 12/15/2022 - 15:43

Late-onset epilepsy is linked to a substantially increased risk of subsequent dementia. Results of a retrospective analysis show that patients who develop epilepsy at age 67 or older have a threefold increased risk of subsequent dementia versus their counterparts without epilepsy.

Dr. Emily L. Johnson

“This is an exciting area, as we are finding that just as the risk of seizures is increased in neurodegenerative diseases, the risk of dementia is increased after late-onset epilepsy and seizures,” study investigator Emily L. Johnson, MD, assistant professor of neurology at Johns Hopkins University, Baltimore, said in an interview. “Several other cohort studies are finding similar results, including the Veterans’ Health Study and the Framingham Study,” she added.

The study was published online Oct. 23 in Neurology.
 

Bidirectional relationship?

Previous research has established that dementia is a risk factor for epilepsy, but recent studies also suggest an increased risk of incident dementia among patients with adult-onset epilepsy. Several risk factors for late-onset epilepsy, including diabetes and hypertension, also are risk factors for dementia. However, the effect of late-onset epilepsy on dementia risk in patients with these comorbidities has not been clarified.

To investigate, the researchers examined data from the Atherosclerosis Risk in Communities (ARIC) study. Participants include Black and White men and women from four U.S. communities. Baseline visits in this longitudinal cohort study began between 1987 and 1989, and follow-up included seven additional visits and regular phone calls.

The investigators identified participants with late-onset epilepsy by searching for Medicare claims related to seizures or epilepsy filed between 1991 and 2015. Those with two or more such claims and age of onset of 67 years or greater were considered to have late-onset epilepsy. Participants with preexisting conditions such as brain tumors or multiple sclerosis were excluded.

ARIC participants who presented in person for visits 2, 4, 5, and 6 underwent cognitive testing with the Delayed Word Recall Test, the Digit Symbol Substitution Test, and the Word Fluency Test.

Testing at visits 5 and 6 also included other tests, such as the Mini-Mental State Examination, the Boston Naming test, and the Wechsler Memory Scale-III. Dr. Johnson and colleagues excluded data for visit 7 from the analysis because dementia adjudication was not yet complete.

The researchers identified participants with dementia using data from visits 5 and 6 and ascertained time of dementia onset through participant and informant interviews, phone calls, and hospital discharge data. Participants also were screened for mild cognitive impairment (MCI) at visits 5 and 6.

Data were analyzed using a Cox proportional hazards model and multinomial logistic regression. In subsequent analyses, researchers adjusted the data for age, sex, race, smoking status, alcohol use, hypertension, diabetes, body mass index (BMI), APOE4 status, and prevalent stroke.

The researchers found that of 9,033 study participants, 671 had late-onset epilepsy. The late-onset epilepsy group was older at baseline (56.5 vs. 55.1 years) and more likely to have hypertension (38.9% vs. 33.3%), diabetes (16.1% vs. 9.6%), and two APOE4 alleles (3.9% vs. 2.5%), compared with those without the disorder.

In all, 1,687 participants developed dementia during follow-up. The rate of incident dementia was 41.6% in participants with late-onset epilepsy and 16.8% in participants without late-onset epilepsy. The adjusted hazard ratio of subsequent dementia in participants with late-onset epilepsy versus those without the disorder was 3.05 (95% confidence interval, 2.65-3.51).

The median time to dementia ascertainment after late-onset epilepsy was 3.66 years.
Counterintuitive finding

The relationship between late-onset epilepsy and subsequent dementia was stronger in patients without stroke. The investigators offered a possible explanation for this counterintuitive finding. “We observed an interaction between [late-onset epilepsy] and stroke, with a lower (but still substantial) association between [late-onset epilepsy] and dementia in those with a history of stroke. This may be due to the known strong association between stroke and dementia, which may wash out the contributions of [late-onset epilepsy] to cognitive impairment,” the researchers wrote.

“There may also be under-capturing of dementia diagnoses among participants with stroke in the ascertainment from [Centers for Medicare & Medicaid Services] codes, as physicians may be reluctant to make a separate code for ‘dementia’ in those with cognitive impairment after stroke,” they added.

When the researchers restricted the analysis to participants who attended visits 5 and 6 and had late-onset epilepsy ascertainment available, they found that the relative risk ratio (RRR) for dementia at visit 6 was 2.90 (95% CI, 1.22-6.92; P = .009). The RRR for MCI was 0.97 (95% CI, 0.39-2.38; P = .803). The greater functional impairment in patients with late-onset epilepsy may explain the lack of a relationship between late-onset epilepsy and MCI.

“It will be important for neurologists to be aware of the possibility of cognitive impairment following late-onset epilepsy and to check in with patients and family members to see if there are concerns,” said Dr. Johnson.

“We should also be talking about the importance of lowering other risk factors for dementia by making sure cardiovascular risk factors are controlled and encouraging physical and cognitive activity,” she added.

The results require confirmation in a clinical population, the investigators noted. In addition, future research is necessary to clarify whether seizures directly increase the risk of dementia or whether shared neuropathology between epilepsy and dementia explains the risk.

“In the near future, I plan to enroll participants with late-onset epilepsy in an observational study to better understand factors that may contribute to cognitive change. Collaborations will be key as we seek to further understand what causes these changes and what could be done to prevent them,” Dr. Johnson added.
 

Strengths and weaknesses

In an accompanying editorial, W. Allen Hauser, MD, professor emeritus of neurology and epidemiology at Columbia University in New York, and colleagues noted that the findings support a bidirectional relationship between dementia and epilepsy, adding that accumulation of amyloid beta peptide is a plausible underlying pathophysiology that may explain this relationship.

Future research should clarify the effect of factors such as seizure type, seizure frequency, and age of onset on the risk of dementia among patients with epilepsy, the editorialists wrote. Such investigations could help elucidate the underlying mechanisms of these conditions and help to improve treatment, they added.

Commenting on the findings, Ilo Leppik, MD, professor of neurology and pharmacy at the University of Minnesota in Minneapolis described the research as “a very well-done study by qualified researchers in the field. … For the last century, medicine has unfortunately become compartmentalized by specialty and then subspecialty. The brain and disorders of the brain do not recognize these silos. … It is not a stretch of the known science to begin to understand that epilepsy and dementia have common anatomical and physiological underpinnings.”

The long period of prospectively gathering data and the measurement of cognitive function through various modalities are among the study’s great strengths, said Dr. Leppik. However, the study’s weakness is its reliance on Medicare claims data, which mainly would reflect convulsive seizures.

“What is missing is how many persons had subtle focal-unaware seizures that may not be identified unless a careful history is taken,” said Dr. Leppik. “Thus, this study likely underestimates the frequency of epilepsy.”

Neurologists who evaluate a person with early dementia should be on the lookout for a history of subtle seizures, said Dr. Leppik. Animal studies suggest treatment with levetiracetam or brivaracetam may slow the course of dementia, and a clinical study in participants with early dementia is underway.

“Treatment with an antiseizure drug may prove to be beneficial, especially if evidence for the presence of subtle epilepsy can be found,” Dr. Leppik added.

Greater collaboration between epileptologists and dementia specialists and larger studies of antiseizure drugs are necessary, he noted. “These studies can incorporate sophisticated structural and biochemical [analyses] to better identify the relationships between brain mechanisms that likely underlie both seizures and dementia. The ultimate promise is that early treatment of seizures may alter the course of dementia,” Dr. Leppik said.

The study by Dr. Johnson and colleagues was supported by a contract from the National Institute on Aging; ARIC from the National Heart, Lung, and Blood Institute; the National Institutes of Health; and the Department of Health & Human Services. The authors and Dr. Leppik have disclosed no relevant financial relationships.

A version of this article originally appeared on Medscape.com.

Issue
Neurology Reviews- 28(12)
Publications
Topics
Sections

Late-onset epilepsy is linked to a substantial increased risk of subsequent dementia. Results of a retrospective analysis show that patients who develop epilepsy at age 67 or older have a threefold increased risk of subsequent dementia versus their counterparts without epilepsy.

Dr. Emily L. Johnson

“This is an exciting area, as we are finding that just as the risk of seizures is increased in neurodegenerative diseases, the risk of dementia is increased after late-onset epilepsy and seizures,” study investigator Emily L. Johnson, MD, assistant professor of neurology at Johns Hopkins University, Baltimore, said in an interview. “Several other cohort studies are finding similar results, including the Veterans’ Health Study and the Framingham Study,” she added.

The study was published online Oct. 23 in Neurology
 

Bidirectional relationship?

Previous research has established that dementia is a risk factor for epilepsy, but recent studies also suggest an increased risk of incident dementia among patients with adult-onset epilepsy. Several risk factors for late-onset epilepsy, including diabetes and hypertension, also are risk factors for dementia. However, the effect of late-onset epilepsy on dementia risk in patients with these comorbidities has not been clarified.

To investigate, the researchers examined data from the Atherosclerosis Risk in Communities (ARIC) study. Participants include Black and White men and women from four U.S. communities. Baseline visits in this longitudinal cohort study began between 1987 and 1989, and follow-up included seven additional visits and regular phone calls.

The investigators identified participants with late-onset epilepsy by searching for Medicare claims related to seizures or epilepsy filed between 1991 and 2015. Those with two or more such claims and age of onset of 67 years or greater were considered to have late-onset epilepsy. Participants with preexisting conditions such as brain tumors or multiple sclerosis were excluded.

ARIC participants who presented in person for visits 2, 4, 5, and 6 underwent cognitive testing with the Delayed Word Recall Test, the Digit Symbol Substitution Test, and the Word Fluency Test.

Testing at visits 5 and 6 also included other tests, such as the Mini-Mental State Examination, the Boston Naming test, and the Wechsler Memory Scale-III. Dr. Johnson and colleagues excluded data for visit 7 from the analysis because dementia adjudication was not yet complete.

The researchers identified participants with dementia using data from visits 5 and 6 and ascertained time of dementia onset through participant and informant interviews, phone calls, and hospital discharge data. Participants also were screened for mild cognitive impairment (MCI) at visits 5 and 6.

Data were analyzed using a Cox proportional hazards model and multinomial logistic regression. In subsequent analyses, researchers adjusted the data for age, sex, race, smoking status, alcohol use, hypertension, diabetes, body mass index (BMI), APOE4 status, and prevalent stroke.

The researchers found that of 9,033 study participants, 671 had late-onset epilepsy. The late-onset epilepsy group was older at baseline (56.5 vs. 55.1 years) and more likely to have hypertension (38.9% vs. 33.3%), diabetes (16.1% vs. 9.6%), and two alleles of APOE4 genotype (3.9% vs. 2.5%), compared with those without the disorder.

In all, 1,687 participants developed dementia during follow-up. The rate of incident dementia was 41.6% in participants with late-onset epilepsy and 16.8% in participants without late-onset epilepsy. The adjusted hazard ratio of subsequent dementia in participants with late-onset epilepsy versus those without the disorder was 3.05 (95% confidence interval, 2.65-3.51).

The median time to dementia ascertainment after late-onset epilepsy was 3.66 years.
 

 

 

Counterintuitive finding

The relationship between late-onset epilepsy and subsequent dementia was stronger in patients without stroke. The investigators offered a possible explanation for this counterintuitive finding. “We observed an interaction between [late-onset epilepsy] and stroke, with a lower (but still substantial) association between [late-onset epilepsy] and dementia in those with a history of stroke. This may be due to the known strong association between stroke and dementia, which may wash out the contributions of [late-onset epilepsy] to cognitive impairment,” the researchers wrote.

“There may also be under-capturing of dementia diagnoses among participants with stroke in the ascertainment from [Centers for Medicare & Medicaid Services] codes, as physicians may be reluctant to make a separate code for ‘dementia’ in those with cognitive impairment after stroke,” they added.

When the researchers restricted the analysis only to participants who attended visits 5 and 6 and had late-onset epilepsy ascertainment available, they found that the relative risk ratio for dementia at visit 6 was 2.90 (95% CI, 1.22-6.92; P = .009). The RRR for MCI was 0.97 (95% CI, 0.39-2.38; P = .803). The greater functional impairment in patients with late-onset epilepsy may explain the lack of a relationship between late-onset epilepsy and MCI.

“It will be important for neurologists to be aware of the possibility of cognitive impairment following late-onset epilepsy and to check in with patients and family members to see if there are concerns,” said Dr. Johnson.

“We should also be talking about the importance of lowering other risk factors for dementia by making sure cardiovascular risk factors are controlled and encouraging physical and cognitive activity,” she added.

The results require confirmation in a clinical population, the investigators noted. In addition, future research is necessary to clarify whether seizures directly increase the risk of dementia or whether shared neuropathology between epilepsy and dementia explains the risk.

“In the near future, I plan to enroll participants with late-onset epilepsy in an observational study to better understand factors that may contribute to cognitive change. Collaborations will be key as we seek to further understand what causes these changes and what could be done to prevent them,” Dr. Johnson added.
 

Strengths and weaknesses

In an accompanying editorial, W. Allen Hauser, MD, professor emeritus of neurology and epidemiology at Columbia University in New York, and colleagues noted that the findings support a bidirectional relationship between dementia and epilepsy, adding that accumulation of amyloid beta peptide is a plausible underlying pathophysiology that may explain this relationship.

Future research should clarify the effect of factors such as seizure type, seizure frequency, and age of onset on the risk of dementia among patients with epilepsy, the editorialists wrote. Such investigations could help elucidate the underlying mechanisms of these conditions and help to improve treatment, they added.

Commenting on the findings, Ilo Leppik, MD, professor of neurology and pharmacy at the University of Minnesota in Minneapolis described the research as “a very well-done study by qualified researchers in the field. … For the last century, medicine has unfortunately become compartmentalized by specialty and then subspecialty. The brain and disorders of the brain do not recognize these silos. … It is not a stretch of the known science to begin to understand that epilepsy and dementia have common anatomical and physiological underpinnings.”

The long period of prospectively gathering data and the measurement of cognitive function through various modalities are among the study’s great strengths, said Dr. Leppik. However, the study’s weakness is its reliance on Medicare claims data, which mainly would reflect convulsive seizures.

“What is missing is how many persons had subtle focal-unaware seizures that may not be identified unless a careful history is taken,” said Dr. Leppik. “Thus, this study likely underestimates the frequency of epilepsy.”

Neurologists who evaluate a person with early dementia should be on the lookout for a history of subtle seizures, said Dr. Leppik. Animal studies suggest treatment with levetiracetam or brivaracetam may slow the course of dementia, and a clinical study in participants with early dementia is underway.

“Treatment with an antiseizure drug may prove to be beneficial, especially if evidence for the presence of subtle epilepsy can be found,” Dr. Leppik added.

Greater collaboration between epileptologists and dementia specialists and larger studies of antiseizure drugs are necessary, he noted. “These studies can incorporate sophisticated structural and biochemical [analyses] to better identify the relationships between brain mechanisms that likely underlie both seizures and dementia. The ultimate promise is that early treatment of seizures may alter the course of dementia,” Dr. Leppik said.

The study by Dr. Johnson and colleagues was supported by a contract from the National Institute on Aging; ARIC from the National Heart, Lung, and Blood Institute; the National Institutes of Health; and the Department of Health & Human Services. The authors and Dr. Leppik have disclosed no relevant financial relationships.

A version of this article originally appeared on Medscape.com.

Late-onset epilepsy is linked to a substantial increased risk of subsequent dementia. Results of a retrospective analysis show that patients who develop epilepsy at age 67 or older have a threefold increased risk of subsequent dementia versus their counterparts without epilepsy.

Dr. Emily L. Johnson

“This is an exciting area, as we are finding that just as the risk of seizures is increased in neurodegenerative diseases, the risk of dementia is increased after late-onset epilepsy and seizures,” study investigator Emily L. Johnson, MD, assistant professor of neurology at Johns Hopkins University, Baltimore, said in an interview. “Several other cohort studies are finding similar results, including the Veterans’ Health Study and the Framingham Study,” she added.

The study was published online Oct. 23 in Neurology
 

Bidirectional relationship?

Previous research has established that dementia is a risk factor for epilepsy, but recent studies also suggest an increased risk of incident dementia among patients with adult-onset epilepsy. Several risk factors for late-onset epilepsy, including diabetes and hypertension, also are risk factors for dementia. However, the effect of late-onset epilepsy on dementia risk in patients with these comorbidities has not been clarified.

To investigate, the researchers examined data from the Atherosclerosis Risk in Communities (ARIC) study. Participants include Black and White men and women from four U.S. communities. Baseline visits in this longitudinal cohort study began between 1987 and 1989, and follow-up included seven additional visits and regular phone calls.

The investigators identified participants with late-onset epilepsy by searching for Medicare claims related to seizures or epilepsy filed between 1991 and 2015. Those with two or more such claims and age of onset of 67 years or greater were considered to have late-onset epilepsy. Participants with preexisting conditions such as brain tumors or multiple sclerosis were excluded.

ARIC participants who presented in person for visits 2, 4, 5, and 6 underwent cognitive testing with the Delayed Word Recall Test, the Digit Symbol Substitution Test, and the Word Fluency Test.

Testing at visits 5 and 6 also included other tests, such as the Mini-Mental State Examination, the Boston Naming test, and the Wechsler Memory Scale-III. Dr. Johnson and colleagues excluded data for visit 7 from the analysis because dementia adjudication was not yet complete.

The researchers identified participants with dementia using data from visits 5 and 6 and ascertained time of dementia onset through participant and informant interviews, phone calls, and hospital discharge data. Participants also were screened for mild cognitive impairment (MCI) at visits 5 and 6.

Data were analyzed using a Cox proportional hazards model and multinomial logistic regression. In subsequent analyses, researchers adjusted the data for age, sex, race, smoking status, alcohol use, hypertension, diabetes, body mass index (BMI), APOE4 status, and prevalent stroke.

The researchers found that of 9,033 study participants, 671 had late-onset epilepsy. The late-onset epilepsy group was older at baseline (56.5 vs. 55.1 years) and more likely to have hypertension (38.9% vs. 33.3%), diabetes (16.1% vs. 9.6%), and two alleles of APOE4 genotype (3.9% vs. 2.5%), compared with those without the disorder.

In all, 1,687 participants developed dementia during follow-up. The rate of incident dementia was 41.6% in participants with late-onset epilepsy and 16.8% in participants without late-onset epilepsy. The adjusted hazard ratio of subsequent dementia in participants with late-onset epilepsy versus those without the disorder was 3.05 (95% confidence interval, 2.65-3.51).

The median time to dementia ascertainment after late-onset epilepsy was 3.66 years.
 

 

 

Counterintuitive finding

The relationship between late-onset epilepsy and subsequent dementia was stronger in patients without stroke. The investigators offered a possible explanation for this counterintuitive finding. “We observed an interaction between [late-onset epilepsy] and stroke, with a lower (but still substantial) association between [late-onset epilepsy] and dementia in those with a history of stroke. This may be due to the known strong association between stroke and dementia, which may wash out the contributions of [late-onset epilepsy] to cognitive impairment,” the researchers wrote.

“There may also be under-capturing of dementia diagnoses among participants with stroke in the ascertainment from [Centers for Medicare & Medicaid Services] codes, as physicians may be reluctant to make a separate code for ‘dementia’ in those with cognitive impairment after stroke,” they added.

When the researchers restricted the analysis only to participants who attended visits 5 and 6 and had late-onset epilepsy ascertainment available, they found that the relative risk ratio for dementia at visit 6 was 2.90 (95% CI, 1.22-6.92; P = .009). The RRR for MCI was 0.97 (95% CI, 0.39-2.38; P = .803). The greater functional impairment in patients with late-onset epilepsy may explain the lack of a relationship between late-onset epilepsy and MCI.

“It will be important for neurologists to be aware of the possibility of cognitive impairment following late-onset epilepsy and to check in with patients and family members to see if there are concerns,” said Dr. Johnson.

“We should also be talking about the importance of lowering other risk factors for dementia by making sure cardiovascular risk factors are controlled and encouraging physical and cognitive activity,” she added.

The results require confirmation in a clinical population, the investigators noted. In addition, future research is necessary to clarify whether seizures directly increase the risk of dementia or whether shared neuropathology between epilepsy and dementia explains the risk.

“In the near future, I plan to enroll participants with late-onset epilepsy in an observational study to better understand factors that may contribute to cognitive change. Collaborations will be key as we seek to further understand what causes these changes and what could be done to prevent them,” Dr. Johnson added.

Strengths and weaknesses

In an accompanying editorial, W. Allen Hauser, MD, professor emeritus of neurology and epidemiology at Columbia University in New York, and colleagues noted that the findings support a bidirectional relationship between dementia and epilepsy, adding that accumulation of amyloid beta peptide is a plausible underlying pathophysiology that may explain this relationship.

Future research should clarify the effect of factors such as seizure type, seizure frequency, and age of onset on the risk of dementia among patients with epilepsy, the editorialists wrote. Such investigations could help elucidate the underlying mechanisms of these conditions and help to improve treatment, they added.

Commenting on the findings, Ilo Leppik, MD, professor of neurology and pharmacy at the University of Minnesota in Minneapolis, described the research as “a very well-done study by qualified researchers in the field. … For the last century, medicine has unfortunately become compartmentalized by specialty and then subspecialty. The brain and disorders of the brain do not recognize these silos. … It is not a stretch of the known science to begin to understand that epilepsy and dementia have common anatomical and physiological underpinnings.”

The long period of prospectively gathering data and the measurement of cognitive function through various modalities are among the study’s great strengths, said Dr. Leppik. However, the study’s weakness is its reliance on Medicare claims data, which mainly would reflect convulsive seizures.

“What is missing is how many persons had subtle focal-unaware seizures that may not be identified unless a careful history is taken,” said Dr. Leppik. “Thus, this study likely underestimates the frequency of epilepsy.”

Neurologists who evaluate a person with early dementia should be on the lookout for a history of subtle seizures, said Dr. Leppik. Animal studies suggest treatment with levetiracetam or brivaracetam may slow the course of dementia, and a clinical study in participants with early dementia is underway.

“Treatment with an antiseizure drug may prove to be beneficial, especially if evidence for the presence of subtle epilepsy can be found,” Dr. Leppik added.

Greater collaboration between epileptologists and dementia specialists and larger studies of antiseizure drugs are necessary, he noted. “These studies can incorporate sophisticated structural and biochemical [analyses] to better identify the relationships between brain mechanisms that likely underlie both seizures and dementia. The ultimate promise is that early treatment of seizures may alter the course of dementia,” Dr. Leppik said.

The study by Dr. Johnson and colleagues was supported by a contract from the National Institute on Aging; ARIC is funded by the National Heart, Lung, and Blood Institute; the National Institutes of Health; and the Department of Health & Human Services. The authors and Dr. Leppik have disclosed no relevant financial relationships.

A version of this article originally appeared on Medscape.com.

Issue
Neurology Reviews- 28(12)

FROM NEUROLOGY

Publish date: November 12, 2020