M. Alexander Otto began his reporting career early in 1999, covering the pharmaceutical industry for a national pharmacists' magazine and freelancing for the Washington Post and other newspapers. He then joined BNA, now part of Bloomberg News, covering health law and the protection of people and animals in medical research. Alex next worked for the McClatchy Company. Based on his work, Alex won a year-long Knight Science Journalism Fellowship to MIT in 2008-2009. He joined the company shortly thereafter. Alex has a newspaper journalism degree from Syracuse (N.Y.) University and a master's degree in medical science – a physician assistant degree – from George Washington University. Alex is based in Seattle.

Do not delay tranexamic acid for hemorrhage

Spectrum of applications being defined

The sooner tranexamic acid is administered for hemorrhage, the better, according to an analysis of 40,138 patients with severe traumatic or postpartum bleeding.

SOURCE: Gayet-Ageron A et al. Lancet. 2017 Nov 7. doi: 10.1016/S0140-6736(17)32455-8.

Early administration of tranexamic acid appears to offer the best hope for a good outcome in a bleeding patient with hyperfibrinolysis. The effect of tranexamic acid on inflammation and other pathways in patients without active bleeding is less clear. It is also unclear whether thromboelastography will move out of the research laboratory and become a routine means of assessment for bleeding patients.

At present, the careful study of Dr. Gayet-Ageron and her coworkers suggests applicability of early administration of this agent in patients with substantial bleeding from multiple causes. As data from additional trials with tranexamic acid become available, the spectrum of applications for this agent should become apparent.

David Dries, MD, is a professor of surgery at the University of Minnesota, Minneapolis. He made his comments in an editorial (Lancet. 2017 Nov 7. doi: 10.1016/S0140-6736(17)32806-4) and had no competing interests.

USPSTF goes neutral on adolescent scoliosis screening

Major research gaps identified

The U.S. Preventive Services Task Force neither recommended for nor recommended against routine screening for adolescent idiopathic scoliosis in new guidelines published Jan. 9 in JAMA.

The determination applies to asymptomatic adolescents 10-18 years old; it does not apply to children and adolescents who present with back pain, breathing difficulties, obvious spine deformities, or abnormal imaging.

The determination is a switch from the last guidance offered in 2004, when the task force decided that screening harms – false-positives, unnecessary radiation exposure, and the psychosocial effect of being tagged with a potentially nonsignificant problem, among others – outweighed the evidence of benefit.

Studies since then, however, have shifted the calculus a bit so that the group “no longer has moderate certainty that the harms of treatment outweigh the benefits ... As a result, the USPSTF has determined that the current evidence is insufficient to assess the balance of benefits and harms of screening for adolescent idiopathic scoliosis,” which led the group to issue an “I statement” for “insufficient evidence,” David C. Grossman, MD, MPH, of Kaiser Permanente Washington Health Research Institute, Seattle, and the other members of the task force wrote.

An I statement means that “if the service is offered, patients should understand the uncertainty about the balance of benefits and harms ... The USPSTF recognizes that clinical decisions involve more considerations than evidence alone. Clinicians should understand the evidence but individualize decision making to the specific patient or situation.”

The task force did find that screening using the forward bend test, scoliometer, or both with radiologic confirmation does a good job of detecting scoliosis. It also found a growing body of evidence that bracing can interrupt or slow scoliosis progression; “however, evidence on whether reducing spinal curvature in adolescence has a long-term effect on health in adulthood is inadequate,” and “evidence on the effects of exercise and surgery on health or spinal curvature in childhood or adulthood is insufficient.” Also, the majority of individuals identified through screening will never require treatment, the task force said.

The guidance is based on a review of 448,276 subjects in 14 studies, more than half of which were published after the last guidance.

USPSTF noted that limited new evidence suggests curves “may respond similarly to physiotherapeutic, scoliosis-specific exercise treatment; if confirmed, this may represent a treatment option for mild curves before bracing is recommended.”

Meanwhile, “surgical treatment remains the standard of care for curves that progress to greater than 40-50 degrees; however, there are no controlled studies of surgical [versus] nonsurgical treatment in individuals with lower degrees of curvature,” the task force said in an evidence review that was also published Jan. 9 in JAMA and was led by pediatrician John Dunn, MD, of Kaiser Permanente Washington Health Research Institute, Seattle.

More than half of U.S. states either mandate or recommend school-based screening for scoliosis. The American Academy of Orthopaedic Surgeons, the Scoliosis Research Society, the Pediatric Orthopaedic Society of North America, and the American Academy of Pediatrics advocate screening for scoliosis in girls at 10 and 12 years and in male adolescents at either 13 or 14 years as part of medical home preventive services. The U.K. National Screening Committee does not recommend screening for scoliosis given the uncertainty surrounding the effectiveness of screening and treatment.

The work was funded by the Agency for Healthcare Research and Quality. The authors had no relevant disclosures.

SOURCE: US Preventive Services Task Force. JAMA. 2018 Jan 9;319(2):165-72; Dunn J et al. JAMA. 2018;319(2):173-87.

Twenty or more states, including highly populous states such as California, New York, Ohio, and Texas, mandate or strongly recommend school-based screening for scoliosis ... Given the new USPSTF recommendations and the I statement [suggesting insufficient evidence], it would be appropriate for states to advise students and parents of the insufficient data about benefits and harms of screening, while also sharing more recent evidence that bracing and exercise therapies may be helpful if scoliosis is clinically diagnosed in screen-positive youth.

The broad lack of evidence regarding the short-term effect of screening for adolescents and long-term health outcomes in later adolescence and into adulthood is a clear obstacle to moving adolescent idiopathic scoliosis recommendations beyond the I rating. Consequently, the gaps in current understanding serve to highlight immediate opportunities for clinical and health services research. For example, a multisite, multiyear observational study could provide evidence about the association between reduction in spinal curvature in adolescence and long-term health outcomes.

John Sarwark, MD, is head of orthopedic surgery at Ann & Robert H. Lurie Children’s Hospital and a professor of orthopedic surgery at Northwestern University, both in Chicago. Matthew Davis, MD, is head of academic general pediatrics and primary care at Lurie and a pediatrics professor at Northwestern. They made their comments in an editorial published Jan. 9 in JAMA and were not involved with the work (2018;319(2):127-9).

Continue to opt for HDT/ASCT for multiple myeloma

High-dose therapy with melphalan followed by autologous stem cell transplant (HDT/ASCT) is still the best option for multiple myeloma, even after almost 2 decades of newer, highly effective induction agents, according to a recent systematic review and two meta-analyses.

Given the “unprecedented efficacy” of “modern induction therapy with immunomodulatory drugs and proteasome inhibitors (also called ‘novel agents’),” investigators “have sought to reevaluate the role of HDT/ASCT,” wrote Binod Dhakal, MD, of the Medical College of Wisconsin, and his colleagues. The report is in JAMA Oncology.

To address the issue, they analyzed five randomized controlled trials conducted since 2000 and concluded that HDT/ASCT is still the preferred treatment approach.

Despite the lack of a demonstrable overall survival benefit, HDT/ASCT confers a significant progression-free survival (PFS) benefit, low treatment-related mortality, and potentially high minimal residual disease-negative rates in newly diagnosed multiple myeloma, the researchers noted.

The combined odds for complete response were 1.27 (95% confidence interval, 0.97-1.65; P = .07) with HDT/ASCT, compared with standard-dose therapy (SDT). The combined hazard ratio (HR) for PFS was 0.55 (95% CI, 0.41-0.70; P less than .001) and 0.76 for overall survival (95% CI, 0.42-1.36; P = .20) in favor of HDT.
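
Pooled estimates like these are typically produced by inverse-variance weighting of the trial-level log hazard ratios. As a rough sketch of that computation (the per-trial numbers below are invented placeholders, not data from the review), the arithmetic looks like this:

```python
import math

# Hypothetical per-trial PFS hazard ratios with 95% CIs (placeholders,
# not the actual trial results pooled by Dhakal et al.).
trials = [
    (0.50, 0.35, 0.71),  # (HR, CI lower bound, CI upper bound)
    (0.61, 0.44, 0.85),
    (0.55, 0.38, 0.80),
]

sum_w = 0.0
sum_w_log_hr = 0.0
for hr, lo, hi in trials:
    # Recover the standard error from the CI width on the log scale.
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    w = 1.0 / se ** 2            # inverse-variance weight
    sum_w += w
    sum_w_log_hr += w * math.log(hr)

pooled = sum_w_log_hr / sum_w
pooled_se = math.sqrt(1.0 / sum_w)
print(f"pooled HR = {math.exp(pooled):.2f} "
      f"(95% CI, {math.exp(pooled - 1.96 * pooled_se):.2f}-"
      f"{math.exp(pooled + 1.96 * pooled_se):.2f})")
```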

PFS was best with tandem HDT/ASCT (HR, 0.49; 95% CI, 0.37-0.65), followed by single HDT/ASCT with bortezomib, lenalidomide, and dexamethasone consolidation (HR, 0.53; 95% CI, 0.37-0.76) and single HDT/ASCT alone (HR, 0.68; 95% CI, 0.53-0.87), compared with SDT. However, none of the HDT/ASCT approaches had a significant impact on overall survival.

Meanwhile, treatment-related mortality with HDT/ASCT was minimal, at less than 1%.

“The achievement of high [minimal residual disease] rates with HDT/ASCT may render this approach the ideal platform for testing novel approaches (e.g., immunotherapy) aiming at disease eradication and cures,” the researchers wrote.

The researchers reported relationships with a number of companies, including Takeda, Celgene, and Amgen, that make novel induction agents.

SOURCE: Dhakal B et al. JAMA Oncol. 2018 Jan 4. doi: 10.1001/jamaoncol.2017.4600.

Key clinical point: High-dose therapy with melphalan plus stem cell transplant is still the best option in untreated multiple myeloma.

Major finding: The combined odds for complete response were 1.27 (95% CI, 0.97-1.65; P = .07) with HDT/ASCT, compared with standard-dose therapy (SDT).

Study details: A systematic review and two meta-analyses examining five phase 3 clinical trials reported since 2000.

Disclosures: The researchers reported relationships with a number of companies, including Takeda, Celgene, and Amgen, that make novel induction agents.

Source: Dhakal B et al. JAMA Oncol. 2018 Jan 4. doi: 10.1001/jamaoncol.2017.4600.

Risks identified for drug-resistant bacteremia in cirrhosis

In patients hospitalized with cirrhosis, four factors independently predicted that bacteremia would be caused by multidrug-resistant organisms (MDROs): biliary cirrhosis, recent health care exposure, nonwhite race, and blood cultures drawn more than 48 hours after admission. The findings come from a medical record review at CHI St. Luke’s Medical Center, an 850-bed tertiary care center in Houston.

“These variables along with severity of infection and liver disease may help clinicians identify patients who will benefit most from broader-spectrum empiric antimicrobial therapy,” wrote the investigators, led by Jennifer Addo Smith, PharmD, of St. Luke’s, in the Journal of Clinical Gastroenterology.

But local epidemiology remains important. “Although a gram-positive agent (e.g., vancomycin) and a carbapenem-sparing gram-negative agent (e.g., ceftriaxone, cefepime) are reasonable empiric agents at our center, other centers with different resistance patterns may warrant different empiric therapy. Given the low prevalence of VRE [vancomycin-resistant Enterococcus] in this study ... and E. faecium in other studies (4%-7%), an empiric agent active against VRE does not seem to be routinely required,” they said.

The team looked into the issue because there hasn’t been much investigation in the United States of the role of multidrug-resistant organisms in bacteremia among patients hospitalized with cirrhosis.

Thirty patients in the study had bacteremia caused by MDROs while 60 had bacteremia from non-MDROs, giving a 33% prevalence of MDRO bacteremia, which was consistent with previous, mostly European studies.

Enterobacteriaceae (43%), Staphylococcus aureus (18%), Streptococcus spp. (11%), Enterococcus spp. (10%), and nonfermenting gram-negative bacilli (6%) were the main causes of bacteremia overall.

Among the 30 MDRO cases, methicillin-resistant S. aureus was isolated in seven (23%); methicillin-resistant coagulase-negative staphylococci in four (13%); fluoroquinolone-resistant Enterobacteriaceae in nine (30%); extended-spectrum beta-lactamase–producing Enterobacteriaceae in three (10%); and VRE in two (7%). No carbapenemase-producing gram-negative bacteria were identified.

The predictors of MDRO bacteremia emerged on multivariate analysis and included biliary cirrhosis (adjusted odds ratio, 11.75; 95% confidence interval, 2.08-66.32); recent health care exposure (aOR, 9.81; 95% CI, 2.15-44.88); blood cultures obtained more than 48 hours after hospital admission (aOR, 6.02; 95% CI, 1.70-21.40); and nonwhite race (aOR, 3.35; 95% CI, 1.19-9.38).
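
Adjusted odds ratios like these come from a multivariable logistic regression, in which each exponentiated coefficient is a predictor’s aOR. A minimal sketch of the mechanics, using simulated stand-in data rather than the study’s records:

```python
import numpy as np
import statsmodels.api as sm

# Simulated stand-in for the chart-review data (not the study's data):
# one row per patient, four binary predictors, outcome = MDRO bacteremia.
rng = np.random.default_rng(7)
n = 1000
X = rng.integers(0, 2, size=(n, 4)).astype(float)
true_log_or = np.array([2.46, 2.28, 1.80, 1.21])  # ~ln of the reported aORs
p = 1.0 / (1.0 + np.exp(-(-2.0 + X @ true_log_or)))
y = (rng.random(n) < p).astype(float)

# Fit the multivariable model; exponentiating a coefficient gives the
# odds ratio for that predictor adjusted for the other three.
fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
print(np.exp(fit.params[1:]))      # adjusted ORs for the four predictors
print(np.exp(fit.conf_int()[1:]))  # their 95% confidence intervals
```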

Blood cultures past 48 hours and recent health care exposure – generally hospitalization within the past 90 days – were likely surrogates for nosocomial infection.

The link with biliary cirrhosis is unclear. “Compared with other cirrhotic patients, perhaps patients with PBC [primary biliary cholangitis] have had more cumulative antimicrobial exposure because of [their] higher risk for UTIs [urinary tract infections] and therefore are at increased risk for MDROs,” they wrote.

The median age in the study was 59 years. Half of the patients were white; 46% were women. Hepatitis C was the most common cause of cirrhosis, followed by alcohol.

MDRO was defined in the study as bacteria not susceptible to at least one antibiotic in at least three antimicrobial categories; 90 cirrhosis patients without bacteremia served as controls.
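
That definition is a simple counting rule over the isolate’s susceptibility panel. A minimal sketch of how it could be applied in code (the antimicrobial categories and drug names are illustrative assumptions, not the study’s panel):

```python
# Hypothetical susceptibility report: category -> list of
# (antibiotic, susceptible?) results. Names are illustrative only.
def is_mdro(panel: dict[str, list[tuple[str, bool]]]) -> bool:
    """Study definition: not susceptible to at least one antibiotic
    in at least three antimicrobial categories."""
    hit_categories = sum(
        1 for results in panel.values()
        if any(not susceptible for _, susceptible in results)
    )
    return hit_categories >= 3

isolate = {
    "fluoroquinolones": [("ciprofloxacin", False)],
    "cephalosporins": [("ceftriaxone", False), ("cefepime", True)],
    "penicillins": [("ampicillin", False)],
    "carbapenems": [("meropenem", True)],
}
print(is_mdro(isolate))  # True: non-susceptible in three categories
```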

The funding source was not reported. Dr. Addo Smith had no disclosures.

SOURCE: Smith JA et al. J Clin Gastroenterol. 2017 Nov 23. doi: 10.1097/MCG.0000000000000964.

*This story was updated on 1/10/2018.

Key clinical point: In patients hospitalized with cirrhosis, nonwhite race, biliary involvement, recent health care exposure, and cultures taken more than 48 hours after hospital admission all independently predicted that bacteremia would be caused by multidrug-resistant organisms.

Major finding: The predictors of multidrug-resistant organism bacteremia emerged on multivariate analysis and included biliary cirrhosis (aOR, 11.75; 95% CI, 2.08-66.32); recent health care exposure (aOR, 9.81; 95% CI, 2.15-44.88); and blood cultures obtained more than 48 hours after hospital admission (aOR, 6.02; 95% CI, 1.70-21.40).

Study details: Review of 90 cirrhotic patients with bacteremia, plus 90 controls.

Disclosures: The lead investigator had no disclosures.

Source: Smith JA et al. J Clin Gastroenterol. 2017 Nov 23. doi: 10.1097/MCG.0000000000000964.

Cars that recognize hypoglycemia? Maybe soon

When researchers at the University of Nebraska placed sensors in the cars of patients with type 1 diabetes, they found something interesting: About 3.4% of the time, the patients were driving with a blood glucose below 70 mg/dL.

Almost 10% of the time, it was above 300 mg/dL, and both hyper- and hypoglycemia, but especially hypoglycemia, corresponded with erratic driving, especially at highway speeds.

The finding helps explain why patients taking insulin for type 1 diabetes have a 12%-19% higher risk of crashing their cars, compared with the general population. But in a larger sense, the study speaks to a new possibility as cars become smarter: monitoring drivers’ mental states and pulling over to the side of the road or otherwise taking control if there’s a problem.

The “results show that vehicle sensor and physiologic data can be successfully linked to quantify individual driver performance and behavior in drivers with metabolic disorders that affect brain function. The work we are doing could be used to tune the algorithms that drive these automated vehicles. I think this is a very important area of study,” said senior investigator Matthew Rizzo, MD, chair of the university’s department of neurological sciences in Omaha.

With funding from Toyota, his team placed a kind of black box inside the cars of 19 patients with type 1 diabetes and 16 diabetes-free controls who were of similar age and educational background. The box had a GPS and an accelerometer to detect and record hard turns, sudden stops, swerves, and other signs that something dangerous had happened. The cars were also rigged with video cameras that recorded both the driver and the view out the windshield.

Participants had the devices in their cars for a month, during which time the diabetes patients were also on continuous, 24-hour blood glucose monitoring. The investigators then synced the car data with the glucose readings and compared it with the data from the controls’ cars. In all, the system recorded more than 1,000 hours of road time across 3,687 drives and 21,232 miles.
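
The syncing step is essentially a time-tolerant join between the glucose trace and the driving events. A minimal sketch of that kind of alignment with pandas (the timestamps, column names, and 10-minute tolerance are illustrative assumptions, not details from the study):

```python
import pandas as pd

# Hypothetical stand-ins for the two data streams: a timestamped CGM
# glucose trace, and timestamped accelerometer flags from the in-car box.
cgm = pd.DataFrame({
    "time": pd.to_datetime(["2017-05-01 08:00", "2017-05-01 08:05",
                            "2017-05-01 08:10"]),
    "glucose_mg_dl": [92, 71, 64],
})
events = pd.DataFrame({
    "time": pd.to_datetime(["2017-05-01 08:07", "2017-05-01 08:11"]),
    "event": ["hard_brake", "swerve"],
})

# Attach the most recent glucose reading within 10 minutes to each
# driving event, then flag events that occurred during hypoglycemia.
merged = pd.merge_asof(events.sort_values("time"), cgm.sort_values("time"),
                       on="time", tolerance=pd.Timedelta("10min"))
merged["hypoglycemic"] = merged["glucose_mg_dl"] < 70
print(merged)
```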

“What we found was that the drivers with diabetes had trouble,” Dr. Rizzo said at the American Neurological Association annual meeting.

Glucose was dangerously high or low about 13% of the time when people with diabetes were behind the wheel. Their accelerometer profiles revealed more risky maneuvering and variability in pedal control even during periods of euglycemia and moderate hyperglycemia, but particularly when hypoglycemia occurred at highway speeds.

One driver almost blacked out behind the wheel when his blood glucose fell below 40 mg/dL. “He might have been driving because he was not aware he had a problem,” Dr. Rizzo said. He is now; he was shown the video.

The team reviewed their subjects’ department of motor vehicles records for the 2 years before the study. All three car crashes in the study population were among drivers with diabetes, and they received 11 of the 13 citations (85%).

The technology has many implications. In the short term, it’s a feedback tool to help people with diabetes stay safer on the road. But the work is also “a model for us to be able to approach all kinds of medical disorders in the real world. We want generalizable models that go beyond type 1 diabetes to type 2 diabetes and other forms of encephalopathy, of which there are many in neurology.” Those models could one day lead to “automated in-vehicle technology responsive to driver’s momentary neurocognitive state. You could have [systems] that alert the car that the driver is in no state to drive; the car could even take over. We are very excited about” the possibilities, Dr. Rizzo said.

Meanwhile, “just the diagnosis of diabetes itself is not enough to restrict a person from driving. But if you record their sugars over long periods of time, and you see the kind of changes we saw in some of the drivers, it means the license might need to be adjusted slightly,” he said.

Dr. Rizzo had no relevant disclosures. One of the investigators was an employee of the Toyota Collaborative Safety Research Center.

SOURCE: Rizzo M et al. ANA 2017, Abstract S131.

Key clinical point: Cars may one day be able to recognize impaired mental status in drivers who have type 1 diabetes, and may take control of the vehicle.

Major finding: Glucose was dangerously high or low about 13% of the time when people with diabetes were behind the wheel.

Study details: Investigators paired real-time driving data with continuous glucose monitoring in patients with type 1 diabetes to assess how blood sugar levels affected driving.

Disclosures: Toyota funded the work. The senior investigator had no relevant disclosures.

Source: Rizzo M et al. ANA 2017, Abstract S131.

First month of LABA/LAMA ups cardiovascular risk

New use of inhaled long-acting beta-2 agonists (LABAs) or long-acting antimuscarinic antagonists (LAMAs) was associated with a 1.5-fold increased cardiovascular risk within 30 days of initiation in patients with chronic obstructive pulmonary disease (COPD), irrespective of prior cardiovascular disease status and history of exacerbations, according to a review of more than 280,000 COPD patients in Taiwan.

The relationship between cardiovascular disease (CVD) and LABAs and LAMAs in COPD has long been debated. The new study addressed some limitations of previous studies, which had found conflicting results, ranging from no increased risk to up to a 4.5-fold increased risk of cardiovascular events when the medications were used for COPD.

Previous randomized trials haven’t raised much concern, but they included prior users, who may have developed tolerance to the heart effects, and excluded patients with baseline CVD. “We caution physicians to closely monitor new users of LABAs or LAMAs for cardiovascular symptoms,” said investigators led by Meng-Ting Wang, PhD, of the National Defense Medical Center, Taipei, who urged vigilance for any cardiovascular symptoms during the first 30 days of inhalation therapy.

“We suspect that there may exist a subgroup of patients with COPD who are particularly at risk of CVD with initial exposure to LABAs or LAMAs ... we suggest that the use of inhaled long-acting bronchodilators in COPD needs to be carefully assessed, and a thorough cardiovascular physical examination, especially heart rate measurement and electrocardiograms, needs to be performed” before prescribing LABAs and LAMAs, they wrote in an article in JAMA Internal Medicine.

The team identified 284,220 COPD patients in the Taiwan National Health Insurance Research Database during 2007-2011 who were new to the medications. During a mean follow-up of 2 years, 37,719 developed severe CVD requiring hospitalization or emergency care, including coronary artery disease, heart failure, ischemic stroke, and arrhythmia.

The team compared their CVD subjects with controls who did not have a heart event and found that new LABA and LAMA use in COPD was associated with a 1.50-fold (95% confidence interval, 1.35-1.67; P less than .001) and a 1.52-fold (95% CI, 1.28-1.80; P less than .001) increased cardiovascular risk within 30 days of initiation, respectively.

One severe CVD event requiring hospitalization or ED care occurred for every 406 (95% CI, 303-580) new LABA users and 391 (95% CI, 254-725) new LAMA users during the first 30 days of therapy.
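
To make these figures concrete, here is a minimal Python sketch of the arithmetic involved. The 2x2 cell counts are hypothetical, chosen only so the cross-product works out to an odds ratio near 1.50; the one-event-per-406-users figure is taken from the study.

```python
import math

# Hypothetical case-control 2x2 table -- illustrative counts only, chosen so
# the odds ratio lands at 1.50; the study's actual cell counts are not given here.
#                 new LABA use   no new use
# CVD cases           a = 300       b = 2000
# controls            c = 150       d = 1500
a, b, c, d = 300, 2000, 150, 1500

odds_ratio = (a * d) / (b * c)                 # cross-product odds ratio
se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)      # Wald SE on the log-odds scale
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log)
print(f"OR = {odds_ratio:.2f} (95% CI, {ci_low:.2f}-{ci_high:.2f})")

# Absolute frequency reported in the study: one severe CVD event per 406
# new LABA users in the first 30 days of therapy.
risk_30day = 1 / 406
print(f"30-day risk = {risk_30day:.2%} of new users")  # about 0.25%
```

The point of the second calculation is that a 1.5-fold relative increase still corresponds to a small absolute 30-day risk for any individual patient.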

The LABA- and LAMA-associated CVD risk remained significant, regardless of patients’ CVD history and COPD exacerbations. Analyses of individual CVD outcomes revealed increased risks of coronary artery disease and heart failure with LABA and LAMA treatment, and an increased risk for cardiac arrhythmias with LAMA therapy.

The cardiovascular risks peaked at around the 30th day of treatment, waned from 31-60 days of treatment, and reduced to a level lower than the baseline risk from 71-240 days.

“Given that CVD is highly prevalent among patients with COPD, clinicians should also pay attention to the management of CVD risk factors throughout the duration of LABA or LAMA therapy ... if needed, a preventive therapy for CVD should be considered during the initial treatment of inhaled long-acting bronchodilators,” the investigators said.

LABAs and LAMAs are believed to cause sympathetic overactivation by activating sympathetic beta-2 adrenergic receptors and suppressing parasympathetic muscarinic-3 receptors, which could contribute to the CVD risk. Also, LABA and LAMA use in COPD has been observed to increase inflammatory cytokine levels, which might also play a role.

The subjects were 40 years or older; the mean age was 71.4 years and 68.9% of the participants were men.

The work was supported by Taiwan’s Ministry of Science and Technology. The investigators had no disclosures.

Eli Zimmerman contributed to this report.

SOURCE: Wang MT et al. JAMA Intern Med. 2018 Jan 2. doi: 10.1001/jamainternmed.2017.7720.


Daniel R. Ouellette, MD, FCCP, comments: Long-acting beta agonists (LABAs) and long-acting muscarinic antagonists (LAMAs) are agents commonly used to treat patients with chronic obstructive pulmonary disease (COPD). These inhaled medications have generally been considered safe and have a favorable side-effect profile. Although some data have suggested that these agents may be associated with increased cardiovascular risk, prospective, controlled studies have generally found that cardiovascular risk is not increased with their use.

Dr. Daniel R. Ouellette
A recent article in JAMA Internal Medicine suggests that patients with COPD who have been initiated on LAMA and LABA agents may have an increased risk of cardiovascular events in the weeks following initiation. Using a large insurance database, investigators from Taiwan found that patients with new prescriptions for these drugs had increased cardiovascular events. The researchers further suggest that previous studies may have overlooked this phenomenon because longitudinal studies assessed cardiovascular risk among patients with established patterns of LAMA and LABA use rather than patients newly initiated on therapy. Those longitudinal populations may therefore have been censored in a way that excluded patients who had events shortly after starting the medications.

Strengths of this study include the size of the database, which is robust, and the novel approach the investigators take to the research question. Weaknesses include the necessarily retrospective design and the fact that the population comes from a single geographic area. Further research will be needed to determine whether initiation of LABA and LAMA medications in COPD patients is associated with increased cardiovascular risk.

FROM JAMA INTERNAL MEDICINE

Vitals

 

Key clinical point: Researchers recommend patients receive a thorough cardiovascular physical examination before they are prescribed LABAs and LAMAs.

Major finding: New use of inhaled long-acting beta-2 agonists or antimuscarinic antagonists was associated with a 1.5-fold increased cardiovascular risk within 30 days of initiation in patients with COPD, irrespective of prior cardiovascular disease status and history of exacerbations.

Study details: The findings are from a review of 284,220 COPD patients in the Taiwan National Health Insurance Research Database.

Disclosures: The work was supported by Taiwan’s Ministry of Science and Technology. The investigators had no disclosures.

Source: Wang MT et al. JAMA Intern Med. 2018 Jan 2. doi: 10.1001/jamainternmed.2017.7720.


ACC guidance addresses newer HFrEF options


 

It might be prudent to monitor N-terminal pro–B-type natriuretic peptide (NT-proBNP) and skip B-type natriuretic peptide (BNP) measurements in heart failure patients on sacubitril/valsartan (Entresto), according to a new expert consensus document from the American College of Cardiology on managing heart failure with reduced ejection fraction.

“While rising natriuretic peptide concentrations are correlated with adverse outcomes, this relationship can be confounded with the use of sacubitril/valsartan. Due to neprilysin inhibition, concentrations of BNP rise in patients treated with sacubitril/valsartan and tend not to return to baseline despite chronic therapy. In contrast, NT-proBNP concentrations typically decrease, as NT-proBNP is not a substrate for neprilysin,” explained authors led by heart failure pathway writing committee chairman Clyde W. Yancy, MD, chief of cardiology at Northwestern University in Chicago (J Am Coll Cardiol. 2017 Dec 22. doi: 10.1016/j.jacc.2017.11.025).

Dr. Clyde W. Yancy
It’s just one of scores of tips in “2017 ACC Expert Consensus Decision Pathway for Optimization of Heart Failure Treatment: Answers to 10 Pivotal Issues About Heart Failure With Reduced Ejection Fraction.” The document is meant to complement the 2017 heart failure (HF) guidelines published jointly by the ACC, the American Heart Association, and the Heart Failure Society of America (J Am Coll Cardiol. 2017 Aug 8;70(6):776-803. doi: 10.1016/j.jacc.2017.04.025).

Treatment of heart failure with reduced ejection fraction (HFrEF) “can feel overwhelming, and many opportunities to improve patient outcomes are being missed; hopefully, this Expert Consensus Decision Pathway may streamline care to realize best possible patient outcomes,” the authors wrote.

The 10 issues and their detailed answers address therapeutic options, adherence, treatment barriers, drug costs, special populations, and palliative care. The document is full of tables and figures of treatment algorithms, drug doses, and other matters.

There’s a good deal of advice about using two newer HFrEF options: sacubitril/valsartan and ivabradine (Corlanor). Sacubitril/valsartan, an angiotensin receptor-neprilysin inhibitor (ARNI), is a switch agent for patients who tolerate but remain symptomatic on ACE inhibitors (ACEIs) or angiotensin II receptor blockers (ARBs). Moving over to sacubitril/valsartan has been shown to decrease the risk of hospitalization and death.

Switching from an ACEI requires a 36-hour washout period to avoid angioedema; no washout is needed for ARB switches. Sacubitril/valsartan doses can be increased every 2-4 weeks to allow time for adjustment to vasodilatory effects. In one study, gradual titration over about 6 weeks maximized attainment of target dosages. As with ACEIs and ARBs, titration might require lowering loop diuretic doses, with careful attention paid to potassium concentrations.

“The committee is aware that clinicians may occasionally consider initiating ARNI in patients who have not previously been treated with an ACEI or ARB. To be explicitly clear, no predicate data supports this approach,” but it “might be considered” if patients are well informed of the risks, including angioedema and hypotension, the committee wrote.

Ivabradine is for patients whose resting heart rate is at or above 70 bpm despite maximal beta-blocker treatment. “It is important to emphasize that ivabradine is indicated only for patients in sinus rhythm, not in those with atrial fibrillation, patients who are 100% atrially paced, or unstable patients. From a safety standpoint, patients treated with ivabradine had more bradycardia and developed more atrial fibrillation as well as transient blurring of vision,” according to the consensus document.

Turning to wireless implantable pulmonary artery pressure monitoring, another newer approach, the group noted that, compared with standard care, it reduced hospitalization and led to more frequent adjustment of diuretic doses, suggesting a benefit “in well-selected patients with recurrent congestion. … The impact on mortality is unknown.”

“For a number of reasons,” hydralazine/isosorbide dinitrate “is often neglected in eligible patients. However, given the benefits of this combination (43% relative reduction in mortality and 33% relative reduction in HF hospitalization), African-American patients should receive these drugs once target or maximally tolerated doses of beta-blocker and ACEI/ARB/ARNI are achieved. This is especially important for those patients with [New York Heart Association] class III to IV symptoms,” the committee members said.
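
For readers who want to translate those relative reductions into absolute terms, the sketch below applies them to assumed baseline event rates. The baseline risks are hypothetical placeholders, not figures from the consensus document.

```python
# Hypothetical baseline annual event rates, chosen for illustration only;
# the consensus document reports relative reductions, not these baselines.
baseline_risks = {
    "mortality": 0.10,           # assume 10% annual mortality without the combo
    "HF hospitalization": 0.20,  # assume 20% annual hospitalization risk
}
relative_reductions = {"mortality": 0.43, "HF hospitalization": 0.33}

for outcome, base in baseline_risks.items():
    rrr = relative_reductions[outcome]
    arr = base * rrr          # absolute risk reduction
    nnt = 1 / arr             # number needed to treat
    print(f"{outcome}: ARR = {arr:.1%}, NNT = {nnt:.0f}")
```

Under these assumed baselines, a 43% relative reduction translates into an absolute reduction of a few percentage points per year; the higher the baseline risk, the larger the absolute benefit.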

Regarding treatment adherence, the group noted that “monetary incentives or other rewards for adherence to medications may be cost saving for highly efficacious and inexpensive drugs such as beta-blockers.”

The work was supported by the ACC with no industry funding. Dr. Yancy had no disclosures.

SOURCE: Yancy C et al. J Am Coll Cardiol. 2017 Dec 22. doi: 10.1016/j.jacc.2017.11.025.

FROM THE JOURNAL OF THE AMERICAN COLLEGE OF CARDIOLOGY

Enhanced recovery protocols after colectomy safely cut LOS


 

Enhanced recovery protocols for elective colectomy shortened length of stay (LOS) by more than a day and decreased complications without increasing readmissions at 15 hospitals in a pilot study of the Enhanced Recovery in National Surgical Quality Improvement Program.

Guidance from experts, engaged multidisciplinary team leadership, continuous data collection and auditing, and collaboration across institutions were all key to success. “The pilot may serve to inform future implementation efforts across hospitals varied in size, location, and resource availability,” investigators led by Julia R. Berian, MD, a surgery resident at the University of Chicago, wrote in a study published online in JAMA Surgery.

Dr. Julia R. Berian
The American College of Surgeons launched the Enhanced Recovery in National Surgical Quality Improvement Program (ERIN) several years ago to help hospitals develop enhanced recovery protocols (ERPs), standardized perioperative care plans to improve outcomes. ERIN provided the 15 hospitals with implementation experts, sample patient education materials and order sets, and opportunities for personnel to share ideas and troubleshoot through workshops and monthly conference calls. Each hospital formed a steering committee with surgery, anesthesia, and nursing leaders, and ERIN helped them track protocol adherence and outcomes.

The program suggested 13 measures aimed at improved pain control, reduced gut dysfunction, and early nutrition and physical activity. Recommendations included shorter fluid fasts and better preop patient counseling; discontinuation of IV fluids and mobilization of patients within 24 hours of surgery; and solid diets within 24-48 hours.

The measures weren’t mandatory; each hospital tailored its protocols, and timing of implementation was at their discretion.

The report didn’t name the 15 hospitals, but they varied by size and academic status. Hospitals were selected for the program because they were outliers on elective colectomy LOS. The study ran during 2013-2015.

There were 3,437 colectomies at the hospitals before implementation, and 1,538 after. Results were compared with those of 9,950 colectomies over the study period at hospitals not involved in the efforts. Emergency and septic cases were excluded.

ERPs decreased mean LOS by 1.7 days, from 6.9 to 5.2 days. After taking patient characteristics and other matters into account, the adjusted decrease was 1.1 days. LOS fell by 0.4 days in the control hospitals (P less than .001).
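
The comparison with control hospitals is essentially a difference-in-differences design. As a rough illustration only, the sketch below subtracts the control-hospital trend from the ERP-hospital change using the reported means; the study's 1.1-day estimate came from a regression that also adjusted for patient characteristics, which this simple arithmetic does not reproduce.

```python
# Reported mean length of stay (days) at ERP hospitals, before vs. after
erp_before, erp_after = 6.9, 5.2
control_change = 0.4   # concurrent LOS decline at non-participating hospitals

erp_change = erp_before - erp_after        # 1.7-day unadjusted decrease
did = erp_change - control_change          # naive difference-in-differences
print(f"Unadjusted ERP decrease: {erp_change:.1f} days")   # 1.7
print(f"Naive difference-in-differences: {did:.1f} days")  # 1.3
# The published adjusted estimate (1.1 days) additionally controls for
# patient characteristics, so it is smaller than this crude 1.3-day figure.
```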

Serious morbidity or mortality in the ERP hospitals decreased from 485 cases (14.1%) before implementation to 162 (10.5%) afterward (P less than .001); there was no change in the control hospitals. After implementation, serious morbidity or mortality was significantly less likely in ERP hospitals (adjusted odds ratio, 0.76; 95% confidence interval, 0.60-0.96).

Meanwhile, there was no difference in readmission rates before and after implementation.

“The ERIN pilot study included hospitals of various sizes, indicating that both small and large hospitals can successfully decrease LOS with implementation of an ERP. ... Regardless of resource limitations, small hospitals may have the advantage of decreased bureaucracy and improved communication and collaboration across disciplines. ... We strongly believe that surgeon engagement and leadership in such initiatives are critical to sustained success,” the investigators wrote.

The ACS; Johns Hopkins’ Armstrong Institute for Patient Safety and Quality, Baltimore; and the Agency for Healthcare Research and Quality have recently launched the “Improving Surgical Care and Recovery” program to provide more than 750 hospitals with tools, experts, and other resources for implementing ERPs. “The program is one opportunity for hospitals seeking implementation guidance,” the investigators noted.

Dr. Berian reported receiving salary support from the John A. Hartford Foundation. Her coinvestigators reported receiving grant or salary support from the foundation and the Agency for Healthcare Research and Quality. One investigator reported relationships with a variety of drug and device companies.
 

SOURCE: Berian J et al. JAMA Surg. 2017 Dec 20. doi: 10.1001/jamasurg.2017.4906.

FROM JAMA SURGERY

Vitals

 

Key clinical point: With the help of the Enhanced Recovery in National Surgical Quality Improvement Program, 15 hospitals enacted enhanced recovery protocols for elective colectomy that shortened length of stay and decreased complications, without increasing readmissions.

Major finding: After taking patient characteristics and other matters into account, the adjusted decrease in LOS was 1.1 days, versus 0.4 days in control hospitals (P less than .001).

Study details: The study compared 3,437 colectomies at 15 hospitals before ERP implementation to 1,538 after.

Disclosures: Dr. Berian reported receiving salary support from the John A. Hartford Foundation. Her coinvestigators reported receiving grant or salary support from the foundation and the Agency for Healthcare Research and Quality. One investigator reported relationships with a variety of drug and device companies.

Source: Berian J et al. JAMA Surg. 2017 Dec 20. doi: 10.1001/jamasurg.2017.4906.


Corynebacterium in the gut may trigger Parkinson’s disease


 

– The presence of Corynebacterium in the gut microbiome of people with two G alleles at the rs356219 single nucleotide polymorphism locus of the alpha-synuclein gene was associated with 100% probability of having Parkinson’s disease in a study conducted by the NeuroGenetics Research Consortium.

If the finding is replicated, it would mean that Corynebacterium is the trigger for Parkinson’s disease (PD) in people with the GG genotype. The GG signature at rs356219 is the strongest genetic risk factor for PD identified to date, but it’s not necessarily strong enough to cause the disease on its own. “It definitely needs a trigger,” and there’s a good chance that Corynebacterium is it, said senior investigator Haydeh Payami, PhD, professor of neurology and genomics at the University of Alabama, Birmingham.

It’s a potentially huge finding that begins to unravel the link between the dozens of genetic risk factors for PD and environmental triggers that push people over the edge. “Corynebacterium isn’t the only bug. I think there are other bugs that go with other” genetic risk factors. Eventually, “we are going to map this whole thing out: Which bugs go with which genetic susceptibilities, and which genetic susceptibilities are triggered by” other things in the environment, such as pesticides, Dr. Payami, leader of the multicenter neurogenetics research collaboration, said in an interview.

Her team genotyped SNCA rs356219 from blood samples in 197 middle-aged PD patients and 115 age-matched controls. They also extracted DNA from stool samples to see what bacteria were in their gut and then looked for interactions between rs356219 genotype, gut microbiome, and PD risk.
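
The report doesn’t specify the statistical model, but a genotype-by-microbiome interaction of this kind is commonly tested with a logistic regression that includes a product term. The sketch below is a minimal illustration on synthetic data; the variable names, counts, and effect sizes are invented, and this is not the consortium’s actual analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 312  # 197 cases + 115 controls, matching the study's sample size

df = pd.DataFrame({
    "g_alleles": rng.integers(0, 3, n),  # 0, 1, or 2 G alleles at rs356219
    "coryne": rng.integers(0, 2, n),     # Corynebacterium detected in stool?
})
# Synthetic outcome: risk rises with the genotype-by-microbe product term
logit = -1 + 1.5 * df.g_alleles * df.coryne
df["pd"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Logistic regression with a genotype x Corynebacterium interaction term;
# the "g_alleles:coryne" row of the output carries the interaction test
model = smf.logit("pd ~ g_alleles * coryne", data=df).fit(disp=False)
print(model.summary().tables[1])
```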

The medical literature has been full of hints for a while now that PD might be set off by something going wrong in the gastrointestinal tract. Colonic inflammation, alpha-synuclein pathology in the gut, and dysbiosis of the gut microbiome in PD are among the many clues. The goal of the work was to find the link between PD and its GI aberrations.

Ninety genera were identified in the stool samples, but “no matter how you looked at the data, whichever method you used, one [genus] kept coming up” for interaction with the rs356219 genotype, “and that was Corynebacterium,” Dr. Payami said.

As in past studies, the rs356219 AA genotype did not increase the odds of PD, and there was no difference in microbiome abundance between PD patients and controls. The GA genotype increased the odds slightly without Corynebacterium, but it increased the odds more than fivefold when Corynebacterium was in the gut (odds ratio, 5.9; P = .04). If people had GG plus Corynebacterium, however, “you nailed it,” Dr. Payami said: The odds of developing PD were infinite (P = .0003).
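
The “infinite” odds ratio is what falls out of a 2x2 table with an empty cell: if every GG carrier with gut Corynebacterium is a case, the exposed-control cell is zero and the cross-product ratio divides by zero. The sketch below illustrates this with made-up counts and shows the common Haldane-Anscombe fix of adding 0.5 to each cell; the numbers are hypothetical, not the study’s.

```python
def odds_ratio(a, b, c, d, correction=False):
    """Cross-product odds ratio for a 2x2 table:
               exposed  unexposed
    cases         a         b
    controls      c         d
    With correction=True, apply the Haldane-Anscombe 0.5 adjustment,
    the usual workaround when a cell count is zero."""
    if correction:
        a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5
    try:
        return (a * d) / (b * c)
    except ZeroDivisionError:
        return float("inf")

# Hypothetical counts: every exposed (GG + Corynebacterium) subject is a case,
# so the exposed-control cell is zero and the naive OR is infinite.
print(odds_ratio(a=12, b=50, c=0, d=60))                   # inf
print(odds_ratio(a=12, b=50, c=0, d=60, correction=True))  # large but finite
```

In other words, “infinite odds” reflects a zero cell in the observed data, not a literal certainty for future patients.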

Corynebacterium was more abundant in GA subjects with PD than GA subjects without PD, but it was by far the most abundant in GG subjects, and every person who had the GG genotype and gut Corynebacterium also had PD.

Corynebacterium species are gram-positive, aerobic bacilli commonly found on the skin. Some members of the genus are opportunistic pathogens. It’s not clear how they become incorporated into the gut microbiome, or whether they can be selectively wiped out in the gut with antibiotics or probiotics.

Perhaps Corynebacterium in the GI tract induces expression of alpha-synuclein protein, a major component of PD Lewy bodies that’s known to travel from the gut to the brain. Maybe the amount expressed depends on how many Gs people have in rs356219. Perhaps “if you have two Gs, you get so much alpha-synuclein that there’s no turning back, and it’s enough to cause PD,” Dr. Payami said.

The study was led by Zachary Wallen, a PhD candidate in Dr. Payami’s lab, and presented by him at the annual meeting of the American Neurological Association. The work was supported by the National Institutes of Health. Dr. Payami and Mr. Wallen had no industry disclosures.

SOURCE: Wallen Z et al. ANA 2017 abstract number S268

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event
Related Articles

 

– The presence of Corynebacterium in the gut microbiome of people with two G alleles at the rs356219 single nucleotide polymorphism locus of the alpha-synuclein gene was associated with 100% probability of having Parkinson’s disease in a study conducted by the NeuroGenetics Research Consortium.

If the finding is replicated, it means that Corynebacterium is the trigger for Parkinson’s disease (PD) in people with the GG genotype. The GG signature at rs356219 is the strongest genetic risk factor for PD identified to date, but it’s not necessarily strong enough to cause the disease on its own. “It definitely needs a trigger,” and there’s a good chance that Corynebacterium is it, said senior investigator Haydeh Payami, PhD, professor of neurology and genomics at the University of Alabama, Birmingham.

Kuo Chun Hung/Thinkstock
It’s a potentially huge finding that begins to unravel the link between the dozens of genetic risk factors for PD and environmental triggers that push people over the edge. “Corynebacterium isn’t the only bug. I think there are other bugs that go with other” genetic risk factors. Eventually, “we are going to map this whole thing out: Which bugs go with which genetic susceptibilities, and which genetic susceptibilities are triggered by” other things in the environment, such as pesticides, Dr. Payami, leader of the multicenter neurogenetics research collaboration, said in an interview.

Her team genotyped SNCA rs356219 from blood samples in 197 middle-aged PD patients and 115 age-matched controls. They also extracted DNA from stool samples to see what bacteria were in their gut and then looked for interactions between rs356219 genotype, gut microbiome, and PD risk.

The medical literature has been full of hints for a while now that PD might be set off by something going wrong in the gastrointestinal tract. Colonic inflammation, alpha-synuclein pathology in the gut, and dysbiosis of the gut microbiome in PD are among the many clues. The goal of the work was to find the link between PD and its GI aberrations.

Ninety genera were identified in the stool samples, but “no matter how you looked at the data, whichever method you used, one [genus] kept coming up” for interaction with the rs356219 genotype, “and that was Corynebacterium,” Dr. Payami said.

As in past studies, the rs356219 AA genotype did not increase the odds of PD, and there was no difference in microbiome abundance between PD patients and controls. The GA genotype increased the odds slightly without Corynebacterium, but it increased the odds more than fivefold when Corynebacterium was in the gut (odds ratio, 5.9; P = .04). If people had GG plus Corynebacterium, however, “you nailed it,” Dr. Payami said: The odds of developing PD were infinite (P = .0003).

Corynebacterium was more abundant in GA subjects with PD than GA subjects without PD, but it was by far the most abundant in GG subjects, and every person who had the GG genotype and gut Corynebacterium also had PD.

Corynebacterium are gram-positive, aerobic bacilli commonly found on the skin. Some members of the genus are opportunistic pathogens. It’s not clear how they get incorporated into the gut microbiome, or if they can be wiped out selectively in the gut with antibiotics or probiotics.

Perhaps Corynebacterium in the GI tract induces expression of alpha-synuclein protein, a major component of PD Lewy bodies that’s known to travel from the gut to the brain. Maybe the amount expressed depends on how many Gs people have in rs356219. Perhaps “if you have two Gs, you get so much alpha-synuclein that’s there’s no turning back, and it’s enough to cause PD,” Dr. Payami said.

The study was led by Zachary Wallen, a PhD candidate in Dr. Payami’s lab, and presented by him at the annual meeting of the American Neurological Association. The work was supported by the National Institutes of Health. Dr. Payami and Mr. Wallen had no industry disclosures.

SOURCE: Wallen Z et al. ANA 2017 abstract number S268

 

– The presence of Corynebacterium in the gut microbiome of people with two G alleles at the rs356219 single nucleotide polymorphism locus of the alpha-synuclein gene was associated with 100% probability of having Parkinson’s disease in a study conducted by the NeuroGenetics Research Consortium.

If the finding is replicated, it means that Corynebacterium is the trigger for Parkinson’s disease (PD) in people with the GG genotype. The GG signature at rs356219 is the strongest genetic risk factor for PD identified to date, but it’s not necessarily strong enough to cause the disease on its own. “It definitely needs a trigger,” and there’s a good chance that Corynebacterium is it, said senior investigator Haydeh Payami, PhD, professor of neurology and genomics at the University of Alabama, Birmingham.

Kuo Chun Hung/Thinkstock
It’s a potentially huge finding that begins to unravel the link between the dozens of genetic risk factors for PD and environmental triggers that push people over the edge. “Corynebacterium isn’t the only bug. I think there are other bugs that go with other” genetic risk factors. Eventually, “we are going to map this whole thing out: Which bugs go with which genetic susceptibilities, and which genetic susceptibilities are triggered by” other things in the environment, such as pesticides, Dr. Payami, leader of the multicenter neurogenetics research collaboration, said in an interview.

Her team genotyped SNCA rs356219 from blood samples in 197 middle-aged PD patients and 115 age-matched controls. They also extracted DNA from stool samples to see what bacteria were in their gut and then looked for interactions between rs356219 genotype, gut microbiome, and PD risk.

The medical literature has been full of hints for a while now that PD might be set off by something going wrong in the gastrointestinal tract. Colonic inflammation, alpha-synuclein pathology in the gut, and dysbiosis of the gut microbiome in PD are among the many clues. The goal of the work was to find the link between PD and its GI aberrations.

Ninety genera were identified in the stool samples, but “no matter how you looked at the data, whichever method you used, one [genus] kept coming up” for interaction with the rs356219 genotype, “and that was Corynebacterium,” Dr. Payami said.

As in past studies, the rs356219 AA genotype did not increase the odds of PD, and there was no difference in microbiome abundance between PD patients and controls. The GA genotype increased the odds slightly without Corynebacterium, but it increased the odds more than fivefold when Corynebacterium was in the gut (odds ratio, 5.9; P = .04). If people had GG plus Corynebacterium, however, “you nailed it,” Dr. Payami said: The odds of developing PD were infinite (P = .0003).

Corynebacterium was more abundant in GA subjects with PD than GA subjects without PD, but it was by far the most abundant in GG subjects, and every person who had the GG genotype and gut Corynebacterium also had PD.
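
This is why the reported odds are “infinite”: when every GG carrier with gut Corynebacterium has PD, one cell of the 2x2 table is zero, so the sample odds ratio is unbounded, yet Fisher’s exact test still returns a P value. A minimal sketch with invented counts illustrates the arithmetic:

```python
# Invented counts purely to illustrate the arithmetic behind an
# "infinite" odds ratio; these are NOT the study's data.
from scipy.stats import fisher_exact

#            PD  no PD
table = [[12,  0],    # GG genotype + gut Corynebacterium
         [40, 60]]    # all other genotype/microbiome combinations
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(odds_ratio, p_value)  # odds_ratio is inf: the zero cell makes it unbounded
```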

Corynebacterium are gram-positive, aerobic bacilli commonly found on the skin. Some members of the genus are opportunistic pathogens. It’s not clear how they get incorporated into the gut microbiome, or if they can be wiped out selectively in the gut with antibiotics or probiotics.

Perhaps Corynebacterium in the GI tract induces expression of alpha-synuclein protein, a major component of PD Lewy bodies that’s known to travel from the gut to the brain. Maybe the amount expressed depends on how many Gs people have at rs356219. Perhaps “if you have two Gs, you get so much alpha-synuclein that there’s no turning back, and it’s enough to cause PD,” Dr. Payami said.

The study was led by Zachary Wallen, a PhD candidate in Dr. Payami’s lab, and presented by him at the annual meeting of the American Neurological Association. The work was supported by the National Institutes of Health. Dr. Payami and Mr. Wallen had no industry disclosures.

SOURCE: Wallen Z et al. ANA 2017, Abstract S268.


REPORTING FROM ANA 2017

Vitals

 

Key clinical point: A causal link between the gut microbiome and Parkinson’s disease might finally have been identified, raising the possibility of treatment to prevent PD in genetically susceptible people.

Major finding: Those who have Corynebacterium in their gut microbiome plus two G alleles at the rs356219 locus of the alpha-synuclein gene have infinite odds of developing PD (P = .0003).

Study details: A case-control study involving 197 middle-aged PD patients and 115 age-matched controls.

Disclosures: The work was supported by the National Institutes of Health. The lead and senior investigators had no industry disclosures.

Source: Wallen Z et al. ANA 2017, Abstract S268.


Head injury linked to amyloid deposition decades later

Article Type
Changed
Fri, 01/18/2019 - 17:17

 

A history of head injury is associated with increased amyloid deposition in the brain, both globally and in the frontal cortex, according to PET imaging in 329 older adults without dementia.

Previous studies have found an association between head injury and later dementia, but the mechanism isn’t understood. The importance of the new investigation is that it “hints at pathophysiology. We were very excited” to find amyloid in the frontal cortex, “which is also associated with Alzheimer’s disease,” said lead investigator Andrea Schneider, MD, PhD, a neurology resident at Johns Hopkins University, Baltimore.

Increased amyloid deposition was not found in other areas associated with the disease, but frontal cortex involvement “does suggest perhaps a common” denominator, she noted.

Although there’s no intervention to prevent amyloid deposition, perhaps there will be someday. “That’s the hope,” Dr. Schneider said at the annual meeting of the American Neurological Association.

The older adults in the study were all participants in the Atherosclerosis Risk in Communities (ARIC) cohort, a group of over 15,000 community-dwelling adults who have been followed since the late 1980s. All 329 had brain amyloid deposition assessed by florbetapir (Amyvid) PET scans in 2011-2013; mean age was 76 years.

Sixty-six (20%) reported a history of head trauma at a median of about 25 years before the scan, with self-reports correlating well with hospitalization billing codes collected as part of ARIC. The cause of the head trauma was not known.

Head injury was associated with an elevated amyloid standardized uptake value ratio (SUVR greater than 1.2) on florbetapir PET. Head injury patients had a 1.31-fold increased prevalence of elevated global amyloid deposition (95% confidence interval, 1.07-1.60), as well as a 1.24-fold increased prevalence of elevated deposition in the orbitofrontal cortex (95% CI, 1.02-1.50) and prefrontal cortex (95% CI, 1.03-1.49), and a 1.29-fold increased prevalence in the superior frontal cortex (95% CI, 1.06-1.56).
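
For readers unfamiliar with the statistic, a prevalence ratio is simply the proportion of exposed participants with the outcome divided by the proportion of unexposed participants with it, and a 95% CI can be approximated on the log scale. The sketch below uses hypothetical counts; the study’s published estimates come from covariate-adjusted models, so its CIs will differ.

```python
# A minimal sketch of a prevalence ratio with a log-normal 95% CI.
# Counts are hypothetical, chosen only to mimic the cohort's size.
import math

def prevalence_ratio(a, n1, b, n0):
    """PR of the outcome in exposed (a of n1) vs. unexposed (b of n0)."""
    pr = (a / n1) / (b / n0)
    se_log = math.sqrt(1/a - 1/n1 + 1/b - 1/n0)  # SE of log(PR)
    lo = math.exp(math.log(pr) - 1.96 * se_log)
    hi = math.exp(math.log(pr) + 1.96 * se_log)
    return pr, lo, hi

# e.g., 30 of 66 head-injured vs. 92 of 263 uninjured participants with
# SUVR > 1.2 (invented numbers; 66 + 263 = 329, as in the study sample)
print(prevalence_ratio(30, 66, 92, 263))
```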

There were no differences in the prevalence of elevated SUVRs in the anterior cingulate, posterior cingulate, precuneus, or the lateral temporal, parietal, or occipital lobes (all P greater than .05), Dr. Schneider reported.

The model was adjusted for age, sex, race, total intracranial volume, and cardiovascular disease, among other things.

Thirty percent of the participants had either one or two APOE4 alleles, and 27% had mild cognitive impairment; neither correlated with increased amyloid deposition.

Head injuries were more common among participants who were men (43%) or white (57%). There was a nonsignificant trend toward a slightly stronger link between head injury and increased amyloid deposition among black individuals (P for interaction = .169).

Dr. Schneider is continuing her work to illuminate the connection between head trauma and dementia. She plans to integrate more detailed cognitive data from ARIC into the analysis, and perhaps emergency department and outpatient data. She also wants to look at more acute imaging after head trauma.

The work was funded by the National Institutes of Health. Avid Radiopharmaceuticals provided the florbetapir. Dr. Schneider did not have any relevant disclosures.

SOURCE: Schneider A et al. ANA 2017, Abstract M336WIP.



REPORTING FROM ANA 2017

Vitals

 

Key clinical point: Head injury is associated with increased brain amyloid deposition globally and in the frontal cortex, which may help explain the association between head trauma and dementia.

Major finding: Head injury patients had a 1.31-fold increased prevalence of elevated global amyloid deposition (95% CI, 1.07-1.60) as well as a 1.24-fold increased prevalence of elevated deposition in the orbitofrontal cortex (95% CI, 1.02-1.50) and prefrontal cortex (95% CI, 1.03-1.49) and 1.29-fold increased prevalence in the superior frontal cortex (95% CI, 1.06-1.56).

Data source: PET imaging in 329 older adults without dementia.

Disclosures: The work was funded by the National Institutes of Health. Avid Radiopharmaceuticals provided the florbetapir. The lead author did not have any relevant disclosures.

Source: Schneider A et al. ANA 2017, Abstract M336WIP.
