Dabigatran raises major bleeding risk

Dabigatran significantly raises the risk of major bleeding and gastrointestinal bleeding across all subgroups of patients with atrial fibrillation, and particularly in African Americans and patients with chronic kidney disease, according to a report published online Nov. 3 in JAMA Internal Medicine.

Physicians should prescribe dabigatran with caution and should fully explain to patients who take the drug how to identify abnormal bleeding so that it can be detected and controlled as early as possible, said Inmaculada Hernandez, Pharm.D., of the department of health policy and management, University of Pittsburgh, and her associates.

The FDA approved dabigatran in 2010 via an accelerated pathway after only 6 months of review, based largely on findings from a single clinical study, the RE-LY (Randomized Evaluation of Long-Term Anticoagulation Therapy) trial, which did not adjust for patient characteristics (N. Engl. J. Med. 2009;361:1139-51). That study reported lower bleeding risks with dabigatran than with warfarin. Several months later, however, the agency’s Adverse Event Reporting System received “a large number” of reports of severe bleeding associated with dabigatran, and the relative bleeding risk of the two drugs remains unclear.

Dr. Hernandez and her colleagues examined the issue using data from a nationally representative random sample of 9,404 Medicare beneficiaries newly diagnosed as having nonvalvular atrial fibrillation during a 1-year period and treated in real-world practice. A total of 1,302 patients were given dabigatran and 8,102 were given warfarin to prevent stroke and systemic embolism. They were followed for a median of about 200 days, until discontinuing or switching their anticoagulant, dying, or reaching the study’s cutoff date. Nine categories of bleeding were assessed, and the data were adjusted to account for numerous demographic and clinical characteristics known to affect bleeding risk.

Compared with warfarin, dabigatran was associated with a significantly higher risk of major bleeding (9.0% vs. 5.9%), with a hazard ratio of 1.58. Dabigatran also was associated with significantly higher risks of GI bleeding (HR, 1.85), hematuria (HR, 1.41), vaginal bleeding (HR, 2.27), hemarthrosis (HR, 2.78), and hemoptysis (HR, 1.49). In contrast, dabigatran was associated with a lower rate of intracranial bleeding (0.6%), as well as lower rates of epistaxis and nonspecified bleeding, the investigators reported (JAMA Intern. Med. 2014 Nov. 3 [doi: 10.1001/jamainternmed.2014.5398]).
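
For readers interpreting these figures, a brief reminder of what adjusted hazard ratios represent (this is the textbook Cox proportional hazards setup, not a claim about the authors’ exact model specification):

\[
h(t \mid x) = h_0(t)\,\exp\!\left(\beta_{\text{drug}}\,x_{\text{drug}} + \beta_1 x_1 + \cdots + \beta_p x_p\right),
\qquad
\mathrm{HR} = \frac{h(t \mid \text{dabigatran},\, x)}{h(t \mid \text{warfarin},\, x)} = e^{\beta_{\text{drug}}},
\]

where \(h_0(t)\) is the baseline hazard and \(x_1, \ldots, x_p\) are the adjustment covariates. An HR of 1.58 for major bleeding thus means a 58% higher instantaneous event rate at any given time of follow-up for dabigatran users with the same covariate values, not a 58% higher cumulative incidence.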

These differences were consistent across numerous subgroups of patients assessed, and were especially strong among African Americans and patients with chronic kidney disease.

This study was supported by the Commonwealth Foundation and the U.S. Agency for Healthcare Research and Quality. Dr. Hernandez and her associates reported having no financial conflicts of interest.

Editor’s Note

The bleeding risk for dabigatran appears to be higher than that for warfarin and significantly greater than it initially seemed at the time of FDA approval.

Hernandez et al. noted that the study on which the FDA based its approval failed to adjust for important differences in patient characteristics, which likely biased the results. They remind us that postmarketing data are crucial for us to advise our patients accurately.

Dr. Rita F. Redberg is the editor of JAMA Internal Medicine and director of women’s cardiovascular services at the Philip R. Lee Institute for Health Policy Studies at the University of California, San Francisco, Medical Center. She reported no financial conflicts of interest. Dr. Redberg made these remarks in an Editor’s Note accompanying Dr. Hernandez’s report (JAMA Intern. Med. 2014 Nov. 3).

Vitals

Key clinical point: Dabigatran raises the risk of major bleeding, contrary to the initial trial findings that supported its fast-track FDA approval.

Major finding: Compared with warfarin, dabigatran was associated with a significantly higher risk of major bleeding (9.0% vs. 5.9%), with a hazard ratio of 1.58.

Data source: A retrospective cohort study of bleeding risks in 1,302 dabigatran users and 8,102 warfarin users who had newly diagnosed nonvalvular atrial fibrillation.

Disclosures: This study was supported by the Commonwealth Foundation and the U.S. Agency for Healthcare Research and Quality. Dr. Hernandez and her associates reported having no financial conflicts of interest.

t-PA May Boost Recovery From Traumatic Brain Injury

When administered as a nasal spray, t-PA may improve functional recovery after less severe forms of traumatic brain injury (TBI), according to a preclinical study in rats published September 3 in PLoS One.

Seven days after laboratory rats sustained TBI, investigators treated them intranasally with saline or t-PA. Compared with saline treatment, subacute intranasal t-PA treatment significantly improved the animals’ cognitive and sensorimotor functional recovery; reduced the cortical stimulation threshold for evoking ipsilateral forelimb movement; enhanced neurogenesis in the dentate gyrus and axonal sprouting of the corticospinal tract from the contralesional cortex into the denervated side of the cervical gray matter; and increased the level of mature brain-derived neurotrophic factor.

“Using this novel procedure in our earlier stroke studies, we found significant improvement in neurologic function,” said Michael Chopp, PhD, Scientific Director of the Henry Ford Neuroscience Institute in Detroit. “We essentially repeated the experiment on laboratory rats with subacute TBI with similarly remarkable results. As in stroke treated intranasally with t-PA, our subjects showed greatly improved functional outcome and rewiring of the cortical spinal tract.”

Although the damage resulting from stroke can be reduced if t-PA is administered intravenously within 4.5 hours, IV t-PA also has potentially harmful side effects, including swelling of the brain and hemorrhage. Researchers at Henry Ford Hospital found that the effective treatment window could be extended to as long as two weeks for laboratory rats dosed with t-PA in a nasal spray, which avoids the harmful side effects of IV injection.

Previous research has indicated that drugs administered through the nose directly target the brain and spinal cord, although researchers do not yet fully understand how this targeting occurs. Although the new study offers hope that a drug treatment will emerge, no effective pharmacologic therapy is available yet.

These most recent findings suggest that t-PA has the potential to be a noninvasive treatment for subacute TBI, thus helping the brain restore function to damaged cells. The investigators noted that further animal studies will be required to determine the best dose and the appropriate time window for optimal intranasal treatment.

Suggested Reading
Meng Y, Chopp M, Zhang Y, et al. Subacute intranasal administration of tissue plasminogen activator promotes neuroplasticity and improves functional recovery following traumatic brain injury in rats. PLoS One. 2014 Sep 3;9(9):e106238.

TBI Is Associated With Increased Dementia Risk in Older Adults

Traumatic brain injury (TBI) appears to be associated with an increased risk of dementia in adults 55 and older, researchers reported online ahead of print October 27 in JAMA Neurology.

Controversy exists about whether there is a link between a single TBI and the risk of developing dementia. According to the CDC, Americans 55 and older account for more than 60% of all hospitalizations for TBI, with the highest rates of TBI-related emergency department visits, inpatient stays, and deaths occurring among patients age 75 and older. Understanding the relationship between TBI and the development of dementia among middle-aged and older adults therefore has important public health implications.

Raquel C. Gardner, MD, Clinical Instructor and Behavioral Neurology Fellow at the University of California, San Francisco, and colleagues examined the risk of dementia among adults age 55 and older with recent TBI, compared with adults with non-TBI body trauma (NTT), defined as fractures excluding those of the head or neck. The study included 164,661 patients identified in a statewide California administrative health database.

A total of 51,799 patients with trauma (31.5%) had TBI. Of those, 4,361 (8.4%) developed dementia, compared with 6,610 (5.9%) of the patients with NTT. The average time from trauma to dementia diagnosis was 3.2 years and was shorter in the TBI group than in the NTT group (3.1 vs 3.3 years). Moderate to severe TBI was associated with an increased risk of dementia in persons age 55 or older, and mild TBI increased the dementia risk at age 65 or older.

“Whether a person with TBI recovers cognitively or develops dementia, however, is likely dependent on multiple additional risk and protective factors, ranging from genetics and medical comorbidities to environmental exposures and specific characteristics of the TBI itself,” the authors noted.

In a related editorial, Steven T. DeKosky, MD, Professor and Chair, Department of Neurology, University of Pittsburgh School of Medicine, stated that “Judicious use of data by skilled researchers who are familiar with the entire range of dementia research from pathobiology to health care needs will enable us to ask important questions, evolve new or more informed queries, and both lead and complement the translational questions that are before us. Dementia is both a global problem and a pathological conundrum; thus, the complementary use of big data and basic neuroscience analyses offers the most promise.”

Suggested Reading
Barnes DE, Kaup A, Kirby KA, et al. Traumatic brain injury and risk of dementia in older veterans. Neurology. 2014;83(4):312-319.
DeKosky ST. The role of big data in understanding late-life cognitive decline: E Unum, Pluribus. JAMA Neurol. 2014 October 27 [Epub ahead of print].
Gardner RC, Burke JF, Nettiksimmons J, et al. Dementia risk after traumatic brain injury vs nonbrain trauma: the role of age and severity. JAMA Neurol. 2014 October 27 [Epub ahead of print].

Chemical Derived From Broccoli Sprouts Shows Promise in Treating Autism

Sulforaphane, a chemical derived from broccoli sprouts, may ease classic behavioral symptoms in patients with autism spectrum disorders (ASDs), according to a study published online ahead of print October 13 in the Proceedings of the National Academy of Sciences.

The study involved 40 males, ages 13 to 27, with moderate to severe autism. Many participants who received a daily dose of sulforaphane experienced substantial improvements in their social interaction and verbal communication, along with decreases in repetitive, ritualistic behaviors, compared with those who received a placebo, according to the researchers.

“We believe that this may be preliminary evidence for the first treatment for autism that improves symptoms by apparently correcting some of the underlying cellular problems,” said Paul Talalay, MD, Professor of Pharmacology and Molecular Sciences at Johns Hopkins University in Baltimore.

“We are far from being able to declare a victory over autism, but this gives us important insights into what might help,” said coinvestigator Andrew Zimmerman, MD, Professor of Pediatric Neurology at UMass Memorial Medical Center in Worcester.

Cause of Autism Is Elusive
Researchers estimate that ASD affects 1% to 2% of the world’s population, with a much higher incidence in boys than in girls. Its behavioral symptoms, such as poor social interaction and verbal communication, are well known and were first described 70 years ago by Leo Kanner, MD.

Unfortunately, its root causes remain elusive, though progress has been made, Dr. Talalay said, in describing some of the biochemical and molecular abnormalities that tend to accompany ASD. Many of these are related to the efficiency of energy generation in cells. Studies show that the cells of patients with ASD often have high levels of oxidative stress, the buildup of harmful, unintended byproducts from the cell’s use of oxygen that can cause inflammation, damage DNA, and lead to cancer and other chronic diseases.

In 1992, Dr. Talalay’s research group found that sulforaphane can bolster the body’s natural defenses against oxidative stress, inflammation, and DNA damage. In addition, the chemical later was found to improve the body’s heat-shock response, a cascade of events used to protect cells from the stress caused by high temperatures, including those experienced when people have fever.

About one-half of parents report that their children’s autistic behavior improves noticeably when they have a fever and then reverts when the fever is gone. In 2007, Dr. Zimmerman tested this anecdotal trend clinically and found it to be true, though a mechanism for the fever effect was not identified. Because fevers, similar to sulforaphane, initiate the body’s heat-shock response, Drs. Zimmerman and Talalay wondered if sulforaphane could cause the same temporary improvement in autism that fevers do.

Improvement Linked to Sulforaphane
Before the start of the trial, the patients’ caregivers and physicians filled out three standard behavioral assessments—the Aberrant Behavior Checklist (ABC), the Social Responsiveness Scale (SRS), and the Clinical Global Impressions-Improvement scale (CGI-I). The assessments measure sensory sensitivities, ability to relate to others, verbal communication skills, social interactions, and other behaviors related to autism. Twenty-six participants were randomly selected to receive, based on their weight, 9 to 27 mg of sulforaphane daily, and 14 received placebo. Behavioral assessments were again completed at four, 10, and 18 weeks while treatment continued. A final assessment was completed for most of the participants four weeks after the treatment had stopped.

Most subjects who responded to sulforaphane showed significant improvements by the first measurement at four weeks and continued to improve during the rest of the treatment. After 18 weeks of treatment, the average ABC and SRS scores of those who received sulforaphane had decreased 34% and 17%, respectively, with improvements in irritability, lethargy, repetitive movements, hyperactivity, awareness, communication, motivation, and mannerisms.

After 18 weeks of treatment, according to the CGI-I scale, 46%, 54%, and 42% of sulforaphane recipients experienced noticeable improvements in social interaction, aberrant behaviors, and verbal communication, respectively.

Dr. Talalay noted that the scores of those who took sulforaphane trended back toward their original values after they stopped taking the chemical, similar to what happens to those who experience improvements during a fever. “It seems like sulforaphane is temporarily helping cells to cope with their handicaps,” he said.

Dr. Zimmerman added that before his group learned which subjects had received sulforaphane or placebo, the impression of the clinical team, including parents, was that 13 participants had noticeably improved. For example, some treated subjects looked them in the eye and shook their hands, which they had not done before. The team later found out that all 13 had been taking sulforaphane, which is half of the treatment group.

Dr. Talalay cautioned that the levels of sulforaphane precursors present in different varieties of broccoli are highly variable. Furthermore, the capacity of individuals to convert these precursors to active sulforaphane also varies greatly. It would be difficult, he noted, to achieve the levels of sulforaphane used in this study by eating large amounts of broccoli or other cruciferous vegetables.

Suggested Reading
Singh K, Connors SL, Macklin EA, et al. Sulforaphane treatment of autism spectrum disorder (ASD). PNAS. 2014 Oct 13 [Epub ahead of print].

New and Noteworthy Information—November 2014

Researchers found no long-term association of vaccines with multiple sclerosis (MS) or any other CNS demyelinating syndromes, according to a study published online ahead of print October 20 in JAMA Neurology. The investigators examined the relationship between vaccines and MS or other CNS demyelinating syndromes using data from Kaiser Permanente Southern California members. The study authors identified 780 cases of CNS demyelinating syndromes and 3,885 controls; 92 cases and 459 controls were females between the ages of 9 and 26, which is the indicated age range for human papillomavirus (HPV) vaccination. The researchers found no association between hepatitis B vaccination, HPV vaccination, or any vaccination and the risk of MS or CNS demyelinating syndromes for as long as three years later. Vaccination of any type was associated with an increased risk of CNS demyelinating syndrome onset within the first 30 days after vaccination, but only in patients younger than 50, and this association was not evident after 30 days.

Bariatric surgery is a potential risk factor for spontaneous intracranial hypotension, according to a study published online ahead of print October 22 in Neurology. Researchers compared a group of 338 patients with spontaneous intracranial hypotension with a control group of 245 people with unruptured intracranial aneurysms. Eleven of the 338 people (3.3%) with spontaneous intracranial hypotension had previously undergone bariatric surgery, compared with two of the 245 people (0.8%) with intracranial aneurysms. Of the 11 people with bariatric surgery and spontaneous intracranial hypotension, nine had no further symptoms after treatment, while symptoms persisted in two. Symptoms started from three months to 20 years after the bariatric surgery, and these participants had lost an average of 116 pounds during that time.

Longitudinal measures of cortical atrophy were widely correlated with sleep quality, according to a study published September 9 in Neurology. The study included 147 adults, ages 20 to 84. Researchers examined the link between sleep difficulties, such as trouble falling asleep or staying asleep at night, and brain volume. All participants underwent two MRI brain scans, an average of 3.5 years apart, before completing a questionnaire about their sleep habits. A total of 35% of the participants met the criteria for poor sleep quality, scoring an average of 8.5 of 21 points on the sleep assessment. The researchers found that sleep difficulties were linked with a more rapid decline in brain volume during the course of the study in various brain regions, including frontal, temporal, and parietal areas. The results were more pronounced in people older than 60.

An international group of researchers has established the first standardized guidelines for the collection of blood to test for early Alzheimer’s disease, as reported online ahead of print September 27 in Alzheimer’s & Dementia. These guidelines will be used in research for blood-based biomarkers of Alzheimer’s disease and will ensure that every laboratory is following the same protocol when collecting blood. The lack of readily available biomarkers is a significant hindrance toward progressing to effective therapeutic and preventative strategies for Alzheimer’s disease. Researchers have worked with representatives from the United States, Germany, Australia, England, and other countries to create these standards. “You can create a blood test in the lab, but if you don’t have a systemized way for collecting the blood, the test will never go into practice,” said the investigators.

A new study suggests a possible cause of amyotrophic lateral sclerosis (ALS), according to a report published online ahead of print October 14 in Proceedings of the National Academy of Sciences. Researchers used advanced biophysical methods to probe how different superoxide dismutase 1 (SOD1) gene mutations in a genetic ALS hotspot affect SOD protein stability. Investigators examined how the aggregation dynamics of mutant SOD G93A differed from those of nonmutant SOD. They developed a method for gradually inducing SOD aggregation, which was measured with small-angle x-ray scattering (SAXS), a structural imaging technique. The G93-mutant SODs appear to have looser, floppier structures that are more likely to drop their copper ions and more likely to misfold and stick together in aggregates. “Our work supports a common theme whereby loss of protein stability leads to disease,” the investigators said.

Long-term functional outcome and risk of fatal or disabling stroke are similar for stenting and endarterectomy for symptomatic carotid stenosis, according to a study published online ahead of print October 14 in the Lancet. Researchers followed 1,713 patients with carotid artery disease, of whom 855 were assigned to stenting and 858 to endarterectomy, for as long as 10 years. The median follow-up was 4.2 years. Both techniques were found to be equally good at preventing fatal and disabling strokes, but stented patients were slightly more likely to have minor strokes without long-term effects. The risk of any stroke in five years was 15.2% in the stenting group, compared with 9.4% in the endarterectomy group, but the additional strokes were minor and had no impact on long-term quality of life.

Researchers have found Class II evidence that serum metabolite profiles accurately distinguish patients with different subtypes and stages of multiple sclerosis (MS), according to a study published October 21 in Neurology. Investigators obtained serum samples from patients with primary progressive MS, secondary-progressive MS, and relapsing-remitting MS, patients with other neurodegenerative conditions, and from age-matched controls. Samples were analyzed by nuclear magnetic resonance, and partial least squares discriminant analysis models were derived to separate disease groups. The partial least squares discriminant analysis models for serum samples from patients with MS enabled reliable differentiation between relapsing-remitting MS and secondary-progressive MS. This approach identified significant differences between the metabolite profiles of each of the MS groups and the healthy controls, as well as predicting disease group membership with high sensitivity and specificity.

Parkinson’s disease pathogenic mutations have an age-dependent penetrance that could be ameliorated or exacerbated by modifier genes or environmental factors in different populations, according to a study published online ahead of print October 20 in JAMA Neurology. The investigators examined 49 previously published studies that included 709 participants and were found in ISI Web of Science and PubMed. They also analyzed extracted information about the number of mutation carriers within families and sporadic cases worldwide for pathogenic mutations in SNCA, LRRK2, VPS35, EIF4G1, and DNAJC13. The end-of-search date was January 31, 2014. In particular, the penetrance of SNCA duplications was comparable to that of point mutations and was driven by inclusion of SNCA p.A53T (mean age at onset, 45.9 years). Each penetrance estimate was given separately with 95% confidence intervals.
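
As context for the pooled estimates in this item, penetrance here is the standard age-dependent quantity (a textbook definition, not the authors’ exact estimator): for a given mutation,

\[
F(t) = P(\text{disease onset by age } t \mid \text{mutation carrier}),
\]

typically estimated from family and sporadic carrier data with Kaplan-Meier methods, with 95% confidence intervals derived from the estimator’s variance (e.g., Greenwood’s formula).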

Spreading depolarizations can be measured after traumatic brain injury (TBI) by placing EEG electrodes on the scalp, according to a study published online ahead of print August 25 in Annals of Neurology. Eighteen patients requiring surgical treatment for TBI were monitored by invasive electrocorticography (ECoG) and noninvasive scalp EEG during intensive care. Spreading depolarizations were first identified in subdural recordings, and EEGs were then examined visually and quantitatively to identify correlates. A total of 455 spreading depolarizations occurred during 65.9 days of simultaneous ECoG and EEG monitoring. For 179 of the 455 events (39%), depolarizations caused temporally isolated, transient depressions of spontaneous EEG amplitude to a median of 57% of baseline power. For 62 of those 179 events (35%), isolated depressions showed a clear spread of depression between EEG channels, with a median delay of 17 minutes.
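
The “57% of baseline power” figure is a ratio of signal power during an event to power during a reference period. As a rough sketch of that kind of computation — with simulated signals and invented parameters, not the study’s actual pipeline — one could estimate band power with Welch’s method and take the ratio:

```python
# Illustrative only: express EEG power during an event as a fraction of
# baseline power, using Welch's method to estimate band power.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

fs = 256  # assumed sampling rate, Hz
rng = np.random.default_rng(1)
baseline = rng.normal(0, 1.0, fs * 60)  # 60 s of simulated baseline EEG
event = rng.normal(0, 0.75, fs * 60)    # amplitude-suppressed segment

def band_power(x, fs, lo=0.5, hi=45.0):
    f, pxx = welch(x, fs=fs, nperseg=fs * 4)
    mask = (f >= lo) & (f <= hi)
    return trapezoid(pxx[mask], f[mask])  # integrate the PSD over the band

ratio = band_power(event, fs) / band_power(baseline, fs)
print(f"event power = {100 * ratio:.0f}% of baseline")
```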

A diet that includes walnuts may reduce the risk, delay the onset, or slow the progression of Alzheimer’s disease, according to a study published October 21 in the Journal of Alzheimer’s Disease. The research group examined the effects of diets containing 6% or 9% walnuts in mice, equivalent to 1 ounce and 1.5 ounces of walnuts per day, respectively, in humans. The investigators found significant improvements in learning, memory, anxiety-related behavior, and motor development in mice fed a walnut-enriched diet. “These findings are very promising and help lay the groundwork for future human studies on walnuts and Alzheimer’s disease,” the investigators said.

Dopamine receptor agonist drugs are associated with impulse control disorders such as pathologic gambling, hypersexuality, and compulsive shopping, according to a study published online ahead of print October 20 in JAMA Internal Medicine. Researchers conducted a retrospective disproportionality analysis of the 2.7 million serious domestic and foreign adverse drug event reports received between 2003 and 2012 and extracted from the FDA Adverse Event Reporting System. The investigators identified 1,580 events indicating impulse control disorders from the United States and 21 other countries (710 for dopamine receptor agonist drugs and 870 for other drugs). The dopamine receptor agonist drugs showed a strong signal for these impulse control disorders. The association was strongest for the dopamine agonists pramipexole and ropinirole, both of which have preferential affinity for the dopamine D3 receptor. A signal was also seen for aripiprazole.
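
Disproportionality analyses of spontaneous-report databases ask whether an event is reported disproportionately often with a drug of interest; one standard measure is the reporting odds ratio (ROR) from a 2×2 table of report counts. The sketch below uses the 710 and 870 event counts quoted above but invents the non-event denominators, so the result is illustrative only and is not the statistic reported in the study.

```python
# Hypothetical 2x2 disproportionality table (denominators invented):
#   a: dopamine-agonist reports with an impulse control event
#   b: dopamine-agonist reports with other events (assumed)
#   c: other-drug reports with an impulse control event
#   d: other-drug reports with other events (assumed)
from math import exp, log, sqrt

a, b = 710, 14_000
c, d = 870, 2_700_000

ror = (a / b) / (c / d)               # reporting odds ratio
se_log = sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(ROR)
lo = exp(log(ror) - 1.96 * se_log)
hi = exp(log(ror) + 1.96 * se_log)
print(f"ROR = {ror:.1f} (95% CI, {lo:.1f}-{hi:.1f})")
```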

Kimberly D. Williams


Cognitive Rest May Be Crucial After Concussion

Article Type
Changed
Mon, 01/07/2019 - 09:54
Display Headline
Cognitive Rest May Be Crucial After Concussion

SAN DIEGO—With all the media attention drawn to the effects of sports-related concussion in recent years, many schools in the United States have adopted return-to-play guidelines, but only a minority have return-to-learn protocols in place, according to a physician who spoke at the 2014 Annual Meeting of the American Academy of Pediatrics.

Literature on the topic is scarce, but one survey of school nurses in Illinois found that 57% of schools in that state had return-to-play protocols, while 30% had protocols in place for returning to the classroom, said Kelsey Logan, MD, Director of the Division of Sports Medicine at Cincinnati Children’s Hospital Medical Center. A survey of youth in Nebraska who had sustained concussions in sports found that a minority (42%) of their teachers provided extra assistance in the classroom following their injury.

Cognitive Activity Can Prolong Recovery
Limiting cognitive activities “is a big part of their stress in getting over their injury,” said Dr. Logan. “I talk to the families about decreasing their child’s emotional stress, and academics are largely a cause of this. They’re stressed from day one about the work they’re missing.… If we address those [concerns] up front, they tend to be a little less stressed.”

Increasing cognitive activity soon after a concussive injury “worsens symptoms and prolongs recovery,” noted Dr. Logan. “That often takes several conversations with patients and parents before they understand that concept. Many times parents want you to micromanage their kid’s day—tell them exactly what they can and can’t do. That’s not really our role. I cannot predict whether 15 versus 20 minutes of looking on a computer is going to make their symptoms worse. Understanding concepts is important. When you start to experience a big gap in energy and your symptoms get worse, you need to back off. Our goal is to determine the appropriate balance of cognitive activity and cognitive rest.”

Creating a Return-to-Learn Plan
Developing a return-to-learning plan following a concussion starts with an assessment of the patient’s symptoms, which vary from individual to individual. “You can’t predict exactly what a person’s going to go through,” said Dr. Logan, one of the authors of a guideline on return to learning that was published in Pediatrics in 2013. “It’s important to consider physical, cognitive, emotional, and sleep symptoms.... Some patients will have many emotional symptoms after a concussion; others won’t. This is why it’s so important for primary care pediatricians to be treating concussions because they know their patients.”

Dr. Logan recommends that patients and their families use checklists to document symptoms, track their severity and progression, and target symptoms to address with school personnel. The ideal role of family members and friends is to enforce rest and reduce stimulation, while the role of the medical team is to evaluate symptoms, prescribe physical and cognitive rest, and get input from family members and school personnel on the patient’s progress. The chief goal is to help the patient get the most out of the school day without worsening symptoms. This process starts with limitations on school time.

“For an athlete who has a constant headache, I would recommend that she stay out of school until she feels a little bit better,” said Dr. Logan. “There’s not a specific symptom score that she needs to meet to go back to school. It’s when the family and the patient feel that she can go to school and concentrate. You don’t want to throw that athlete back into a full school day right away. You want to start with a few hours of school, maybe a half-day, depending on symptoms.”

The Importance of Rest Breaks
Acutely concussed athletes can only concentrate for 30- to 45-minute blocks of time, added Dr. Logan, so “I like to prescribe rest breaks. I try to get them to recognize that if they go to a hard class like calculus and have to work hard for 45 minutes or so, they’re probably going to be fried for the next period, so there needs to be something a little less onerous like study hall, or lunch, where they can rest. They need to use common sense during the day.”

During office visits, Dr. Logan reviews the school day schedule with patients, “and we try to target different areas where they can feel comfortable to rest. I’m asking their opinion on where the best spots in their day are to get some rest. Because if I just say, ‘you’re going to do this, this, and this,’ what’s their likelihood of following through with those instructions? It’s really low.”


Reducing the Burden of Schoolwork
Dr. Logan recommends limiting computer time, reading, math, and note-taking during recovery, because each task tends to cause symptoms to worsen. “Having either the teacher’s notes supplied to them or having another student take notes for them may allow them to tolerate more class time than they would if they were trying to take notes,” said Dr. Logan. “Listen to lectures only.” At home, students should perform only activities that don’t exacerbate symptoms. This means limiting instant messaging, texting, watching TV, and playing video games.

A subset of concussed patients are overstimulated by light and sound, “so it’s important to ask about that and make adjustments in the school day,” said Dr. Logan. “This [approach] would involve reducing sound and light when you can and wearing sunglasses and earplugs.”

Dr. Logan recommends delaying tests that may fall within the timeline of recovery, such as midterms, finals, or college-readiness tests like the SAT. “A brain-injured person is not going to do well on any of these tests,” she said. “In notes to school personnel, write ‘no testing for now,’ or ‘postpone testing.’ ”

Doug Brunk

References

Suggested Reading
Halstead ME, McAvoy K, Devore CD, et al. Returning to learning following a concussion. Pediatrics. 2013;132(5):948-957.






‘Chemo brain’ may have targetable causes

Article Type
Changed
Wed, 01/04/2023 - 16:51
Display Headline
‘Chemo brain’ may have targetable causes

BOSTON – The risk for cognitive decline following cancer treatment varies by both cancer and therapy types, and can range from subtle changes to severe deficits, according to a researcher.

Patients who are older, have lower cognitive reserves, or have comorbidities such as cardiovascular disease or diabetes are at risk for cognitive problems following cancer treatment, said Tim A. Ahles, Ph.D., director of the Neurocognitive Research Laboratory, Memorial Sloan-Kettering Cancer Center, New York.

“I think it’s important that we identify these modifiable risk factors for intervention, and some of the nonmodifiable risk factors that inform decision making,” he said at the Palliative Care in Oncology Symposium.

“When we talk about cognitive function, we’re really talking about how does cancer and cancer treatment impact on memory, concentration, executive function, or ability to multitask, the speed at which we process information,” he said.

Oncologists have known for decades that brain tumors and their treatment have a negative effect on cognitive function, particularly among children under 5 years of age, whose developing brains are sensitive to treatments such as chemotherapy, surgery, radiation, and high-dose steroids.

There is also a dose-response effect, with high-dose chemotherapy, such as the ablative regimens used for bone-marrow transplantation, being associated with a higher probability of cognitive problems.

Dr. Ahles noted that about two-thirds of adult survivors of childhood cancers develop chronic illnesses within 30 years of diagnosis, including cardiac and pulmonary disease, and diabetes and endocrine dysfunction.

“It turns out they’re also at higher risk for cognitive issues,” including white matter abnormalities and microvascular stroke, he said.

The population of survivors of brain tumors and childhood cancers is dwarfed, however, by the large and growing population of breast, colorectal, lung, and prostate cancer patients who are diagnosed every year and exposed to adjuvant therapies, he added.

Aging and cognitive reserve

Evidence from breast cancer studies has shown that about 20%-25% of patients have lower than expected cognitive functioning – based on age, education, occupation, and other factors – before they embark on adjuvant therapy.

“That’s actually a risk factor for posttreatment cognitive decline, so there’s something that’s already going on that’s disrupting the cognitive information-processing system before we even start adjuvant treatment that may be critically important in terms of their outcomes as survivors,” Dr. Ahles said.

A significant subset of women in longitudinal studies of breast cancer survivors – about 15%-30% – experience long-term posttreatment cognitive problems, making it imperative for researchers and clinicians to identify risk factors for persistent cognitive decline, he said.

There is evidence to suggest that cancer treatments may interplay with biologic factors at the cellular level to increase the risk for cognitive loss. For example, aging is associated with reduction in brain volume, decrease in white matter integrity, and decreases in vascularization and neurotransmitter activity.

The effects of age on the brain are attenuated, however, among patients with higher cognitive reserves, defined as a combination of innate and developed cognitive capacity. Cognitive reserve is influenced by a number of factors, including genetics, education, occupational attainment, and lifestyle.

High cognitive reserve has been associated with later onset of Alzheimer’s disease symptoms and smaller changes in cognitive function with normal aging or following a brain injury, Dr. Ahles noted.

In a longitudinal study of cognitive changes associated with adjuvant therapy for breast cancer, Dr. Ahles and colleagues found that both age and pretreatment cognitive reserve were related to posttreatment decline in processing speed in women exposed to chemotherapy, compared with those who did not have chemotherapy or with healthy controls. In addition, chemotherapy had a short-term impact on verbal ability. The authors found evidence to suggest the patterns they saw may be related to the combination of chemotherapy and tamoxifen.

Part of the difficulty of studying cognitive decline among older patients is the higher prevalence of changes associated with aging. Dr. Ahles pointed to a French longitudinal study of women over 65 being treated for breast cancer, in which investigators found that 41% of the study population had cognitive impairment before starting on adjuvant therapy.

Older adults may be more frail, with diminished biological reserves and lower resistance to stressors caused by “cumulative declines across a variety of physiological systems making you more vulnerable to adverse events,” Dr. Ahles said.

Aging and genetics

Genes associated with cognitive aging are also risk factors for posttreatment cognitive decline, notably the genetic variants of APOE, including the epsilon 4 (APOE-e4) allele linked to increased risk for early-onset Alzheimer’s disease.

Dr. Ahles and colleagues had previously shown that APOE-e4 may be a biomarker for increased risk for chemotherapy-induced cognitive decline. The adverse effects of APOE-e4 appear to be mitigated somewhat by smoking, because it may correct for a deficit in nicotinic receptor density and dopamine levels in carriers.


Another genetic factor linked to postchemotherapy cognitive decline is the Val158Met polymorphism of the gene encoding catechol-O-methyltransferase (COMT), an enzyme that degrades neurotransmitters such as dopamine. Patients with this polymorphism have rapid dopamine metabolism, resulting in reduced dopamine activity.

These findings point to potential molecular mechanisms for cognitive changes associated with chemotherapy, and suggest that therapies targeted at neurotransmitter systems may ameliorate the effect, Dr. Ahles said.

He noted that animal studies have shown that fluoxetine (Prozac) prevents deficits in behavior and hippocampal function associated with 5-fluorouracil (5-FU), and that nicotine patches have been shown to improve cognitive functioning in patients with mild cognitive impairment.

The symposium was cosponsored by AAHPM, ASCO, ASTRO, and MASCC.


Vitals

Key clinical point: Cognitive decline following chemotherapy may reflect an interplay of aging and drug-induced molecular changes.

Major finding: An estimated 20%-25% of women in breast cancer studies have lower than predicted cognitive function before starting chemotherapy.

Data source: Review of evidence on the association between cancer chemotherapy and cognitive decline.

Disclosures: Dr. Ahles’ work is supported by Memorial Sloan-Kettering Cancer Center. He reported having no relevant disclosures.

Air pollution not to blame for childhood leukemia, study suggests

Article Type
Changed
Mon, 11/03/2014 - 06:00
Display Headline
Air pollution not to blame for childhood leukemia, study suggests

Power lines in England

The increased risk of leukemia reported among children born close to overhead power lines is likely not a result of the lines altering local air pollution, researchers reported in the Journal of Radiological Protection.

The group found little evidence to support the “corona-ion hypothesis,” which has been cited as a possible explanation for the excess of childhood leukemia cases close to high-voltage overhead power lines in the UK prior to the 1980s.

The hypothesis is based on the fact that high-voltage overhead power lines create charged particles in the surrounding air.

These ionized particles, known as corona ions, can be blown away by the wind and attach to air pollutants, such as those from traffic or smoking.

The corona-ion hypothesis suggests these electrically charged pollutants are more likely to be retained in the airways or lungs, and this could lead to serious health effects, including childhood leukemia.

The researchers previously showed that, on average, there has been no increased risk of leukemia among children born near high-voltage power lines in recent decades. However, the same piece of research confirmed an increased risk prior to the 1980s, which has yet to be explained.

To investigate this theory, John Swanson, of National Grid in London, and his colleagues used data from 7,347 children in England and Wales who were born and diagnosed with leukemia between 1968 and 2008, and who lived within 600 m of a high-voltage overhead power line.

The researchers calculated each subject’s exposure to corona ions using a model based on: the voltage of the power line; the distance from the line; how the concentration of corona ions varied with distance from the line; and, using data from various meteorological stations, how often and how strongly the wind blew in each direction around the power lines.
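
To make the shape of such a model concrete, here is a minimal Python sketch. The exponential fall-off, the 100 m decay length, and the multiplicative weighting are assumptions for illustration only; the study’s actual model is not reproduced here.

```python
import math

# Minimal illustrative sketch of a corona-ion exposure model of the kind
# described above. The exponential fall-off, the 100 m decay length, and
# the multiplicative weighting are illustrative assumptions, not the
# study's actual model.

def corona_ion_exposure(voltage_kv: float,
                        distance_m: float,
                        wind_fraction_toward_home: float,
                        decay_length_m: float = 100.0) -> float:
    """Combine the four inputs named in the text into one exposure score.

    voltage_kv -- line voltage (higher voltage, more corona ions)
    distance_m -- distance from the home to the line (study window: 600 m)
    wind_fraction_toward_home -- share of time the wind blows from the
        line toward the home, taken from meteorological records
    decay_length_m -- assumed e-folding distance for ion concentration
    """
    if distance_m > 600:
        return 0.0  # beyond the study's 600 m window
    # Assume the ion concentration falls off exponentially with distance.
    concentration = math.exp(-distance_m / decay_length_m)
    return voltage_kv * concentration * wind_fraction_toward_home

# Example: a home 150 m from a 400 kV line, downwind 30% of the time.
print(f"relative exposure: {corona_ion_exposure(400.0, 150.0, 0.30):.1f}")
```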

The results did not suggest that exposure to corona ions explained the pattern of increased leukemia rates close to high-voltage overhead power lines previously found in earlier decades.

“We found in earlier studies that, for previous decades, childhood leukemia rates were higher near power lines,” said Kathryn Bunch, of the University of Oxford.

“This new paper seems to show that this wasn’t caused by corona ions, but it leaves us still searching for the true cause, and we are undertaking further investigations of the variation in risk over time.”


New agents challenge role of transplant in high-risk CLL

Article Type
Changed
Mon, 11/03/2014 - 06:00
Display Headline
New agents challenge role of transplant in high-risk CLL

Preparing for HSCT
Credit: Chad McNeeley

NEW YORK—The role of allogeneic hematopoietic stem cell transplant (HSCT) for patients with high-risk chronic lymphocytic leukemia (CLL) is changing in the age of targeted therapy.

While allogeneic HSCT has been considered standard treatment for these patients, the question arises whether it will maintain its position in the era “of all these wonderful new drugs,” said David Maloney, MD, PhD, of the Fred Hutchinson Cancer Research Center in Seattle, Washington.

Dr Maloney undertook to convince the audience at the Lymphoma & Myeloma 2014 congress that there is still a role for allogeneic transplant in CLL patients.

He noted that early allogeneic transplant trials used myeloablative conditioning regimens, which were “prohibitively toxic.” They have now given way to reduced-intensity regimens.

“But the breakthrough came about when it was realized that the reason that allogeneic transplant could cure patients with CLL had really nothing to do with their conditioning regimen . . . ,” he said. “[I]t was probably the donor T cells providing immunologic activity and graft-vs-host activity that was actually able to provide graft-vs-tumor activity and cure patients.”

Seattle regimen

Dr Maloney described the reduced-intensity regimen used in Seattle—fludarabine and 2 Gy total body irradiation. The single 2-Gy dose is about one-sixth of a typical myeloablative total body irradiation course (commonly 12 Gy).

“This is truly an outpatient regimen,” he said. “Most patients, 50%, get through this without ever being in the hospital.”

At 5 years of follow-up, overall survival was 43%, progression-free survival was 36%, the complete response rate was 52%, and the relapse rate was 34%.

“This may not look very good,” Dr Maloney said, but these are fludarabine-refractory CLL patients whose expected median survival is around 12 months.

Dr Maloney noted that approximately the same outcomes were achieved whether the graft was from a matched related or unrelated donor, and cytogenetics really didn’t play a huge role in outcome.

The biggest factor affecting outcome was lymph node size. Patients with nodes 5 cm or larger did very poorly. And patients with lymph nodes smaller than 5 cm, irrespective of white cell count or bone marrow infiltration, actually did quite well in comparison to the group with large lymph nodes.

“So the graft-vs-tumor activity seems to be limited in its ability to get rid of bulky lymphadenopathy in this population,” Dr Maloney said.

Prior alemtuzumab therapy was also associated with the worst outcome in terms of relapse and disease progression.

Patients without comorbidities and without bulky lymphadenopathy have a very good outcome, Dr Maloney noted, saying, “You can cure 60% to 70% with an allogeneic transplant.”

He also pointed out that many groups are now doing this type of transplant with related and unrelated donors.

Transplant vs new agents

In addition to offering a potential cure, allogeneic transplant may provide better-functioning hematopoietic and immune systems after transplant than before, especially in those patients who received FCR (fludarabine, cyclophosphamide, and rituximab) or other treatments.

Transplant, while potentially curative with a high complete response rate, has early non-relapse mortality around 15% to 20%.

“So this makes it hard to position in this era of pills that you can take,” Dr Maloney said.

He pointed out that while ibrutinib and idelalisib have excellent outcomes and overall survival, “these studies are very, very early . . .  but obviously extremely promising.”

A group of European physicians recently published a position paper proposing a treatment algorithm that includes transplant for high-risk CLL patients. The algorithm indicates that relapsed/refractory patients should try the novel agents first.

Then, if patients respond, they can continue with the novel agent or proceed to transplant. Patients with lower-risk disease, or those at higher transplant risk, should probably continue on the novel agent.

Those who are younger with higher-risk disease, such as a 17p deletion, or who are at low transplant risk may be willing to choose transplant earlier.

Patients who do not respond to the novel agents can consider transplant or an alternative salvage regimen.
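
As a rough sketch of that decision flow, the snippet below encodes the summary above in Python; the input flags and the returned recommendations are simplifying assumptions for illustration, not the position paper’s exact criteria or wording.

```python
# Illustrative encoding of the treatment algorithm as summarized above.
# The inputs and returned recommendations are simplifying assumptions,
# not the position paper's exact wording or criteria.

def next_step_for_relapsed_refractory_cll(responded_to_novel_agent: bool,
                                          high_risk_disease: bool,
                                          low_transplant_risk: bool) -> str:
    """Suggest a next step after a trial of a novel agent.

    high_risk_disease -- e.g., presence of a 17p deletion
    low_transplant_risk -- e.g., younger patient with few comorbidities
    """
    if not responded_to_novel_agent:
        # No response: transplant or an alternative salvage regimen.
        return "consider allogeneic HSCT or alternative salvage therapy"
    if high_risk_disease and low_transplant_risk:
        # Responders with high-risk disease but low transplant risk may
        # elect transplant earlier.
        return "discuss early allogeneic HSCT vs. continuing novel agent"
    # Lower-risk disease or higher transplant risk: stay on the agent.
    return "continue novel agent; defer transplant"

print(next_step_for_relapsed_refractory_cll(
    responded_to_novel_agent=True,
    high_risk_disease=True,       # e.g., 17p deletion
    low_transplant_risk=True))
```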

“[O]bviously, this is extremely controversial,” Dr Maloney said, “and what everyone is going to do is use these new agents to push transplant further down the road. And I think that’s appropriate.”

At the very least, Dr Maloney believes patients deserve a discussion of options early on.

He added that chimeric antigen receptor (CAR) T cells “will likely bump transplant even down another notch” because patients are likely to be willing to take the risk of CAR T cells before they’ll take the risk of chronic graft-vs-host disease with an unrelated donor.


Emergency Departments Monitored, Investigated by Hospital Committees, Governmental Agencies

Article Type
Changed
Fri, 09/14/2018 - 12:12
Display Headline
Emergency Departments Monitored, Investigated by Hospital Committees, Governmental Agencies

Why is it that there are no focused looks into the ED? We all know, as hospitalists, that the ED locks us into many admissions. Yet I see no initiatives through the Centers for Medicare and Medicaid Services (CMS) going after the ED for wanting patients admitted rather than trying to get these patients sent home for outpatient therapy.

–Ray Nowaczyk, DO

Dr. Hospitalist responds:

Au contraire, my fellow hospitalist! The ED is monitored and investigated by many hospital committees and governmental agencies. Although we physicians, and I’m sure most hospitals, have always acknowledged our responsibilities to take care of patients during an emergency, this responsibility was enshrined in legalese in 1986 with the passage of the Emergency Medical Treatment and Active Labor Act (EMTALA), also known as the “antidumping law.” Since its passage, any hospital that receives Medicare or Medicaid funding, which includes almost all of them, is at risk of being fined or losing this vital source of funding if this law is violated.

EMTALA essentially states that any patient who presents to the ED must be provided a screening exam and treatment for any “emergency medical condition” (including labor), regardless of the individual’s ability to pay. The hospital is then required to provide “stabilizing” treatment for these patients or transfer them to another facility where this treatment can be provided. Furthermore, hospitals that refuse to accept these patients in transfer without valid reasons (e.g., no open beds) can be charged with an EMTALA violation.

As you well know, what is considered stabilized or at baseline by one clinician can be seen as unstable or requiring urgent care by another. The real day-to-day practice of medicine often defies evidence-based logic and forces us to make decisions based on many clinical and nonclinical variables.

These situations are further compounded by recent CMS attempts to hold hospitals publicly accountable for ED throughput by posting these measures on its website. Along with other metrics, the citizenry can now see how long it takes an ED patient to be seen by a health professional, receive pain medication if they have a broken bone, receive appropriate treatment and be sent home, or, if admitted, how long it takes to get into a bed.

This information makes it clearer that in situations of clinical uncertainty, it may be easier for many ED physicians to admit than to discharge. The “treat-’em or street-’em” mentality of triaging patients, of course, varies from doc to doc and can definitely create antipathy toward physicians in the ED. As much as I may disagree with some of our ED docs’ admissions, I always—OK, maybe not always—try to assume they have the patient’s best interest at heart.

Once admitted, the onus is placed on us, as hospitalists, to determine whether the patient requires ongoing inpatient care, can be cared for in an “observation” capacity, or should be discharged. We all have received calls from a nurse informing us that the patient “does not meet inpatient criteria”—even if the patient is hypotensive with systemic inflammatory response syndrome and lactic acidosis. Oh, if we could only send them back to the ED!

Do you have a problem or concern that you’d like Dr. Hospitalist to address? Email your questions to [email protected].
