Urgent care and retail clinics fuel inappropriate antibiotic prescribing
When it comes to inappropriate antibiotic prescribing, urgent care centers and retail clinics may have an outsized impact, a review of commercial insurance claims suggests.
Antibiotic prescription rates were at least twice as high in those settings, compared with emergency departments and medical office visits, according to the retrospective analysis.
The issue may be particularly pronounced in urgent care centers, based on this study, in which nearly half of visits for antibiotic-inappropriate respiratory diagnoses resulted in antibiotic prescribing.
Those findings suggest a need for “antibiotic stewardship interventions” to reduce unnecessary prescribing of antibiotics in ambulatory care settings, authors of the analysis reported in a research letter to JAMA Internal Medicine.
“Efforts targeting urgent care centers are urgently needed,” wrote Danielle L. Palms, MPH, of the Centers for Disease Control and Prevention, Atlanta, and her coauthors.
The retrospective study by Ms. Palms and her colleagues included claims from 2014 in a database of individuals 65 years of age or younger with employer-sponsored insurance.
The researchers included encounters in which medical and prescription coverage data were captured: approximately 2.7 million urgent care center visits, 58,000 retail clinic visits, 4.8 million emergency department visits, and 148.5 million medical office visits.
They found antibiotic prescriptions linked to 39.0% of urgent care and 36.4% of retail clinic visits, compared with 13.8% of emergency department visits and 7.1% of medical office visits.
For respiratory diagnoses for which antibiotics would be inappropriate, such as viral upper respiratory infections, antibiotics were nevertheless prescribed in 45.7% of urgent care visits, compared with 24.6% of emergency department visits, 17.0% of medical office visits, and 14.4% of retail clinic visits.
Those data show “substantial variability” that suggests case mix differences and evidence of antibiotic overuse, particularly in the urgent care setting, the researchers said in their letter.
Another recent study, covering the 2010-2011 period, found that at least 30% of antibiotic prescriptions written in U.S. physician offices and emergency departments were unnecessary.
“The finding of the present study that antibiotic prescribing for antibiotic inappropriate respiratory diagnoses was highest in urgent care centers suggests that unnecessary antibiotic prescribing nationally in all outpatient settings may be higher than the estimated 30%,” wrote Ms. Palms and her coinvestigators.
The research was funded by the Centers for Disease Control and Prevention. Ms. Palms and her coauthors reported no conflicts of interest.
SOURCE: Palms DL et al. JAMA Intern Med. 2018 Jul 16.
This study suggests urgent care and retail clinics are “underrecognized” contributors to the ongoing problem of inappropriate antibiotic prescribing, according to authors of an invited commentary.
The urgent care sector, a $15 billion business representing more than 10,000 high-volume U.S. clinics, is growing rapidly because of convenient locations, same-day access to care, and lower out-of-pocket expenditures versus emergency departments, the authors said.
“Lowering barriers for an office visit to such a degree may prompt frequent visits for mild self-resolving illnesses that would be better treated with rest and symptom management at home,” they wrote.
Innovations such as telephone triage lines could help reduce inappropriate antibiotic prescribing, but might “conflict with the business model” of urgent care and retail clinics, they added.
“Unfortunately, we all pay – in increased insurance premiums and increased antibiotic resistance – from the overprescribing of antibiotics for upper respiratory tract infections,” they wrote.
Michael A. Incze, MD, MSEd, and Rita F. Redberg, MD, MSc, are with the department of medicine, University of California, San Francisco. Mitchell H. Katz, MD, is with New York City Health and Hospitals. These comments are based on their invited commentary appearing in JAMA Internal Medicine. All three authors reported having no conflicts of interest.
FROM JAMA INTERNAL MEDICINE
Key clinical point: Antibiotic prescribing, including for antibiotic-inappropriate respiratory diagnoses, was markedly more frequent in urgent care centers than in other ambulatory care settings.
Major finding: For respiratory diagnoses where antibiotics would be inappropriate, such as viral upper respiratory infections, antibiotics were nevertheless prescribed in 45.7% of urgent care visits.
Study details: A retrospective cohort study including claims from 2014 in a database for individuals 65 years of age or younger with employer-sponsored insurance.
Disclosures: The research was funded by the Centers for Disease Control and Prevention. The authors reported no conflicts of interest.
Source: Palms DL et al. JAMA Intern Med. 2018 Jul 16.
T-DM1 produces 44% response rate in HER2-mutant lung cancers
Ado-trastuzumab emtansine (T-DM1) has met a predefined efficacy endpoint in what investigators say is the first positive clinical trial in patients with advanced HER2-mutant lung cancers.
The HER2-targeted therapy produced a 44% overall response rate among 18 patients enrolled in the phase 2, investigator-initiated basket trial reported in the Journal of Clinical Oncology.
The median progression-free survival (PFS) was 5 months in this heavily pretreated group of patients, according to first author Bob T. Li, MD, of Memorial Sloan Kettering Cancer Center, New York, and his coauthors.
“This is important therapeutic progress in the context of more than a decade of negative clinical trials targeting HER2 in lung cancer,” the researchers wrote.
The patients (median age, 64 years; 72% female) had metastatic lung adenocarcinomas treated in 2016 at Memorial Sloan Kettering Cancer Center. All patients had HER2-activating mutations identified by next-generation sequencing. They had received a median of two lines of prior therapy. Three of the 18 patients were treatment naive. Prior HER2-targeted therapy, including trastuzumab, was allowed.
All patients received intravenous infusions of T-DM1 at 3.6 mg/kg every 21 days until progression of disease or unacceptable toxicity.
Confirmed partial responses were seen in 8 patients (44%), while an additional 7 (39%) had stable disease, Dr. Li and his coauthors reported. Median PFS was 5 months overall and 6 months for responders, with the longest observed PFS being more than 11 months.
The treatment was well tolerated, with treatment-related adverse events mainly consisting of grade 1-2 infusion reactions, elevation of hepatic transaminases, and thrombocytopenia.
The rate of infusion reactions was higher than expected based on the experience with T-DM1 in breast cancer, though these reactions were generally mild and did not require discontinuation of the drug, which is approved for the treatment of HER2-amplified or -overexpressing metastatic breast cancer.
These findings have important implications for drug development in HER2-mutant lung cancers, particularly as next-generation sequencing becomes more commonly used for initial tumor evaluation, Dr. Li and his coauthors noted.
“Just as the discovery of EGFR mutations eventually led to a plethora of approved oncogene-targeted therapies transforming the care of patients around the world, HER2-activating mutations similarly show promise as a therapeutic target,” they wrote.
Disappointing results were seen in previous studies looking at trastuzumab in lung cancer patients, and in those trials, patients were selected on the basis of HER2 protein expression by immunohistochemistry (IHC). “More recent studies have again confirmed that HER2 IHC is not the ideal biomarker in lung cancers,” the researchers wrote.
The 18-patient cohort in this study was part of a larger, investigator-initiated basket trial. Cohorts not reported at this time involved patients with other HER2-amplified solid tumors, including bladder cancer.
The study was supported by the Conquer Cancer Foundation, Genentech, and a grant from the National Institutes of Health.
Dr. Li reported disclosures related to Roche, Biosceptre International, Thermo Fisher Scientific, Mersana, Guardant Health, Genentech, Illumina, BioMed Valley Discoveries, AstraZeneca, and GRAIL. Several coauthors reported disclosures, including employment, with NantOmics, a developer of molecular profiling tools.
SOURCE: Li BT et al. J Clin Oncol. 2018 July 10. doi: 10.1200/JCO.2018.77.9777.
FROM THE JOURNAL OF CLINICAL ONCOLOGY
Key clinical point: Ado-trastuzumab emtansine (T-DM1) demonstrated activity in advanced, HER2-mutant lung cancer.
Major finding: Confirmed partial responses were seen in 44% of patients. Median progression-free survival was 5 months.
Study details: Analysis of a cohort of 18 patients with HER2-mutant metastatic lung adenocarcinomas enrolled in a phase 2, investigator-initiated basket trial.
Disclosures: The Conquer Cancer Foundation, Genentech, and the National Institutes of Health supported the study. The researchers reported disclosures related to Roche, Biosceptre International, Thermo Fisher Scientific, Mersana, Guardant Health, Genentech, Illumina, BioMed Valley Discoveries, AstraZeneca, GRAIL, and NantOmics, among others.
Source: Li BT et al. J Clin Oncol. 2018 July 10. doi: 10.1200/JCO.2018.77.9777.
New hypertension guidelines would add 15.6 million new diagnoses
A new analysis estimates that adopting the 2017 ACC/AHA hypertension guidelines would add 15.6 million Americans to the ranks of those with hypertension, half of whom would be candidates for treatment.
Similar increases would occur in other countries, according to study authors, who analyzed two large datasets from the United States and China.
The increase stems from resetting the definition of adult hypertension from the long-standing threshold of 140/90 mm Hg to a blood pressure at or above 130/80 mm Hg, meaning more than half of people aged 45-75 years in both countries would be classified as having hypertension, according to the researchers, led by Harlan M. Krumholz, MD, of the Center for Outcomes Research and Evaluation at Yale–New Haven (Conn.) Hospital and the section of cardiovascular medicine at Yale.
An additional 7.5 million Americans would be recommended for treatment under the new lower treatment thresholds, with a correspondingly large increase in the Chinese population, according to results published in the BMJ.
The guideline changes are “not firmly rooted in evidence” and could have health policy implications that include strain on public health programs, Dr. Krumholz and his colleagues said in their report on the study.
“The change occurs at a time when both countries have substantial numbers of people who are not aware of having hypertension, and who have hypertension that is not controlled, even according to the previous standards,” they wrote.
The analysis by Dr. Krumholz and his colleagues was based on the two most recent cycles of the U.S. National Health and Nutrition Examination Survey (NHANES), representing 2013-2014 and 2015-2016 periods, as well as the China Health and Retirement Longitudinal Study (CHARLS) in 2011-2012.
Under the new ACC/AHA guidelines, they found, 70.1 million Americans aged 45-65 years would be classified as hypertensive, representing 63% of that age group. That’s a 27% relative increase over the 55.3 million individuals, or 49.7%, with hypertension as defined in the JNC-8 guidelines.
In addition, 15.6 million persons would be classified as eligible for treatment but not receiving it, up from 8.1 million under the JNC-8 guidance.
Previous estimates projected a far greater jump in new hypertension classifications, including one based on data from NHANES, antihypertensive clinical trials, and population-based cohort studies, which estimated that 31 million people would newly carry the label (JAMA Cardiol. 2018 May 23. doi: 10.1001/jamacardio.2018.1240).
In the current analysis, 267 million people in China aged 45-65 years (55% of that age group) would be classified as having hypertension under the ACC/AHA guidelines, a relative increase of 45% over the JNC-8 guidelines, while the number of candidates for treatment would rise to 129 million, from 74.5 million under the earlier guidelines.
Dr. Krumholz and his colleagues noted that the ACC/AHA guideline changes were prompted by results from the SPRINT trial. However, the improvements in outcomes seen in SPRINT, which enrolled patients at high risk for cardiovascular events but without diabetes, have not been observed in individuals at low or intermediate risk, or in those with diabetes, they said.
“Expanding the pool of patients who merit treatment to include those at low risk could potentially render public health programs less efficient and viable,” they wrote in a discussion of health policy implications.
The new guidelines also put millions at risk of the “psychological morbidity” that comes with the label of a chronic disease, and at risk for more adverse events caused by inappropriate use of drug therapy, they added.
Dr. Krumholz reported research agreements from Medtronic and from Johnson and Johnson (Janssen) through Yale University, and a grant from the Food and Drug Administration and Medtronic. He reported other disclosures related to UnitedHealth, the IBM Watson Health Life Sciences Board, Element Science, Aetna, and Hugo, a personal health information platform he founded. First author Rohan Khera, MD, reported support from the National Institutes of Health.
SOURCE: Khera R et al. BMJ. 2018 Jul 11;362:k2357.
This article was updated 7/19/18.
This study addressing hypertension guideline changes is unique because it was initially published on a public preprint server.
Preprints are common in some scientific areas, but uncommon in major medical journals. They allow investigators to share research, quickly and openly, for critique and feedback before standard peer review and publication.
In the case of this study, researchers analyzed the public health implications of the anticipated changes to the 2017 ACC/AHA hypertension guidelines in two nationally representative data sets from the United States and China.
The authors quickly finalized their manuscript right after the revised hypertension guidelines were released. They chose the preprint approach because they realized their research would be immediately relevant to the discussion that followed, first author Rohan Khera, MD, recounted on BMJ Blogs.
“The traditional approach of submitting to a medical journal would mean being out of the public eye for several months,” Dr. Khera said in his post. “The preprint platform offered us an excellent opportunity of ensuring early dissemination of our research study in its entirety, while we sought its evaluation by peer reviewers and the refinement by a medical journal.”
The manuscript was submitted via a Web-based system and was publicly available 2 hours later on the same day the guidelines were published. The researchers received comments and suggestions on the preprint, some of which were incorporated into the final manuscript they submitted for peer review.
Then the manuscript went through the usual iterative peer review process; however, the preprint was still available online to guide other investigators and limit duplication of effort, Dr. Khera said in his blog post.
That contrasts with another recent experience in which Dr. Khera and his colleagues performed work that “failed to inform” ongoing policy discussions, and other research efforts, while they waited for eventual publication.
“We hope that more journals will accept the benefits of science that is publicly available while journal editors and peer reviewers carry out their critical role of improving both the quality and the impact of these scientific contributions,” Dr. Khera wrote.
Rohan Khera, MD, a cardiology fellow at the University of Texas Southwestern Medical Center, Dallas, wrote about his experience with preprints for BMJ Blogs. Dr. Khera had no conflicts of interest to disclose.
A new analysis estimates that adopting the 2017 ACC/AHA hypertension guidelines would add 15.6 million Americans to the ranks of those with hypertension, and half of those would be candidates for treatment.
Similar increases would occur in other countries, according to study authors, who analyzed two large datasets from the United States and China.
The increase stems from resetting the definition of adult hypertension from the long-standing threshold of 140/90 mm Hg to a blood pressure at or above 130/80 mm Hg, a change under which more than half of people aged 45-75 years in both countries would be classified as having hypertension, according to the researchers, led by Harlan M. Krumholz, MD, of the Center for Outcomes Research and Evaluation at Yale–New Haven (Conn.) Hospital and the section of cardiovascular medicine at Yale.
An additional 7.5 million Americans would be recommended for treatment under the new lower treatment thresholds, with a correspondingly large increase in the Chinese population, according to results published in the BMJ.
The guideline changes are “not firmly rooted in evidence” and could have health policy implications that include strain on public health programs, Dr. Krumholz and his colleagues said in their report on the study.
“The change occurs at a time when both countries have substantial numbers of people who are not aware of having hypertension, and who have hypertension that is not controlled, even according to the previous standards,” they wrote.
The analysis by Dr. Krumholz and his colleagues was based on the two most recent cycles of the U.S. National Health and Nutrition Examination Survey (NHANES), representing 2013-2014 and 2015-2016 periods, as well as the China Health and Retirement Longitudinal Study (CHARLS) in 2011-2012.
Under the new ACC/AHA guidelines, they found, 70.1 million Americans aged 45-65 years would be classified as hypertensive, representing 63% of that age group. That’s a 27% relative increase over the 55.3 million individuals, or 49.7%, with hypertension as defined in the JNC-8 guidelines.
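The relative increase reported above can be checked directly from the counts; this is a back-of-the-envelope calculation, not a figure taken from the paper:

```latex
\[
\frac{70.1 - 55.3}{55.3} \approx 0.268 \approx 27\%\ \text{relative increase,}
\]
\[
\text{versus } 63\% - 49.7\% \approx 13\ \text{percentage points in absolute terms.}
\]
```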
In addition, 15.6 million persons would be classified as eligible for treatment but not receiving it, up from 8.1 million under the JNC-8 guidance.
Previous estimates projected a far greater jump in new hypertension classifications, including one that used data from the National Health and Nutrition Examination Survey, antihypertensive clinical trials, and population-based cohort studies. That study estimated that 31 million people would newly carry the label (JAMA Cardiol. 2018 May 23. doi: 10.1001/jamacardio.2018.1240).
In the current analysis, in China, 267 million aged 45-65 years (55% of that age group) would be classified with hypertension under the ACC/AHA guidelines, a relative increase of 45% over the JNC-8 guidelines, while the number of candidates for treatment would be 129 million, up from 74.5 million under the earlier guidelines.
Dr. Krumholz and his colleagues noted that the ACC/AHA guideline changes were prompted by results from the SPRINT trial. However, the improvements in outcomes seen in SPRINT, which included patients at high risk for cardiovascular events but without diabetes, have not been observed in individuals at low or intermediate risk, or in those with diabetes, they said.
“Expanding the pool of patients who merit treatment to include those at low risk could potentially render public health programs less efficient and viable,” they wrote in a discussion of health policy implications.
The new guidelines also put millions at risk of the “psychological morbidity” that comes with the label of a chronic disease, and at risk for more adverse events caused by inappropriate use of drug therapy, they added.
Dr. Krumholz reported research agreements from Medtronic and from Johnson and Johnson (Janssen) through Yale University, and a grant from the Food and Drug Administration and Medtronic. He reported other disclosures related to UnitedHealth, the IBM Watson Health Life Sciences Board, Element Science, Aetna, and Hugo, a personal health information platform he founded. First author Rohan Khera, MD, reported support from the National Institutes of Health.
SOURCE: Khera R et al. BMJ. 2018 Jul 11;362:k2357.
This article was updated 7/19/18.
FROM THE BMJ
Key clinical point: The 2017 ACC/AHA hypertension guidelines could dramatically increase the number of individuals with hypertension and candidates for treatment.
Major finding: The number of individuals with untreated hypertension would increase from 8.1 million to 15.6 million under the new guidelines.
Study details: A cross-sectional study of adults in nationally representative databases in the United States (NHANES) and China (CHARLS).
Disclosures: Authors reported disclosures related to Medtronic, Johnson and Johnson (Janssen), the Food and Drug Administration, UnitedHealth, the IBM Watson Health Life Sciences Board, Element Science, Aetna, and Hugo.
Source: Khera R et al. BMJ 2018;362:k2357.
Primary efficacy endpoint not met by new M. tuberculosis vaccine strategies
Vaccination may have reduced the rate of sustained Mycobacterium tuberculosis infection in a recent randomized, placebo-controlled clinical trial conducted in a high-risk setting for tuberculosis transmission, despite not meeting the primary endpoint of the study.
In adolescents who had received the bacille Calmette-Guérin (BCG) vaccine in infancy, BCG revaccination reduced the rate of sustained conversion of QuantiFERON-TB Gold In-Tube assay (QFT), a test that is thought to reflect sustained M. tuberculosis infection.
The study also evaluated a candidate subunit vaccine, H4:IC31, which also reduced the rate of sustained QFT conversion, though the efficacy estimate did not reach statistical significance, investigators reported.
Neither H4:IC31 nor BCG revaccination prevented initial QFT conversion, the primary endpoint of the study; however, both vaccines were immunogenic, they said.
Moreover, the significantly reduced rate of sustained conversion with BCG revaccination provides a “promising signal,” study authors said in the New England Journal of Medicine.
“The durability of this important finding and potential public health significance for protection against tuberculosis disease warrants epidemiologic modeling and further clinical evaluation,” wrote Elisa Nemes, PhD, of the South African Tuberculosis Vaccine Initiative, which is part of the Institute of Infectious Disease and Molecular Medicine at the University of Cape Town (South Africa), and her coauthors.
Similarly, the nonsignificantly reduced rate of sustained QFT conversion seen with H4:IC31 suggested that subunit vaccines can have a biologic effect in this setting, which may inform development of new tuberculosis vaccines, Dr. Nemes and her colleagues added.
The phase 2 trial included 990 adolescents in South Africa who had undergone neonatal BCG vaccination. They were randomly assigned to receive BCG revaccination, H4:IC31 vaccine, or placebo.
Neither vaccine met the primary efficacy criterion based on initial QFT conversion rates, which were 13.1% for BCG revaccination, 14.3% for H4:IC31 vaccine, and 15.8% for placebo.
For the secondary endpoint of sustained QFT conversion, the efficacy of BCG revaccination was 45.4% (95% confidence interval, 6.4%-68.1%; P = .03), while the efficacy of H4:IC31 vaccine was 34.2% (95% CI, –10.4% to 60.7%; P = .11).
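For readers unfamiliar with how such efficacy percentages arise: in trials of this design, vaccine efficacy is conventionally defined as one minus the rate (or hazard) ratio between the vaccine and placebo groups. The exact statistical model used in this trial is not described here, so the following is only the standard formula:

```latex
\[
\mathrm{VE} = \left(1 - \frac{\text{event rate}_{\text{vaccine}}}{\text{event rate}_{\text{placebo}}}\right) \times 100\%
\]
```

Under that convention, a reported efficacy of 45.4% corresponds to a rate ratio of roughly 1 − 0.454 = 0.546 for sustained QFT conversion in the BCG revaccination group relative to placebo.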
“These encouraging findings provide an impetus to reevaluate the use of BCG revaccination of populations that are free of M. tuberculosis infection for the prevention of disease,” Dr. Nemes and her coauthors wrote in their report.
Revaccination with BCG was associated with more adverse events, compared with the other groups, although adverse events in the trial were predominantly injection-site reactions that were mild to moderate in severity, investigators reported. There were no serious adverse events judged by investigators to be related to trial vaccine.
Taken together, these results raise important questions regarding the potential benefits of vaccine-mediated prevention of M. tuberculosis infection for control of tuberculosis disease, according to Dr. Nemes and her coauthors.
However, interpretation of the findings is limited because there is no definitive test for M. tuberculosis infection.
Recent infection diagnosed by tuberculin skin test or QFT conversion has been associated with higher risk of disease, compared with nonconversion, according to investigators, while reversion to a negative tuberculin skin test correlates with infection containment and lower risk of tuberculosis.
“Although the clinical significance of QFT reversion remains to be established, we propose that sustained QFT conversion more likely represents sustained M. tuberculosis infection and a higher risk of progression to disease than transient QFT conversion,” they wrote.
The study was supported by Aeras, Sanofi Pasteur, the Bill & Melinda Gates Foundation, the Government of the Netherlands Directorate-General for International Cooperation and Development, and the United Kingdom Department for International Development. Study authors reported disclosures related to GlaxoSmithKline, Sanofi Pasteur, and Aeras.
SOURCE: Nemes E et al. N Engl J Med. 2018;379:138-49.
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Key clinical point: Neither H4:IC31 nor BCG revaccination prevented initial QFT conversion, the primary endpoint; however, both vaccines were immunogenic.
Major finding: For the secondary endpoint of sustained QuantiFERON-TB Gold In-Tube Assay (QFT) conversion, efficacy was 45.4% (P = .03) for BCG revaccination and 34.2% (P = .11) for H4:IC31, a candidate subunit vaccine.
Study details: A phase 2, randomized, placebo-controlled trial including 990 adolescents in South Africa who had received BCG vaccine in infancy.
Disclosures: The study was supported by Aeras, Sanofi Pasteur, the Bill & Melinda Gates Foundation, the Government of the Netherlands Directorate-General for International Cooperation and Development, and the United Kingdom Department for International Development. Study authors reported disclosures related to GlaxoSmithKline, Sanofi Pasteur, and Aeras.
Source: Nemes E et al. N Engl J Med. 2018;379:138-49.
Recombinant poliovirus appears safe, active as recurrent glioblastoma treatment
Treatment with the recombinant poliovirus vaccine PVSRIPO in patients with recurrent glioblastoma can be delivered at a safe dose with efficacy that compares favorably with historical data, recently reported results of a phase 1, nonrandomized study suggest.
The survival rate at 36 months after intratumoral infusion of PVSRIPO was 21%, versus 4% in a control group of patients who would have met the study’s eligibility criteria, investigators wrote in the New England Journal of Medicine.
There was no evidence of virus shedding or viral neuropathogenicity in the study, which included 61 patients with recurrent World Health Organization grade IV malignant glioma. “Further investigations are warranted,” wrote Annick Desjardins, MD, of Duke University, Durham, N.C., and her coauthors.
The prognosis of WHO grade IV malignant glioma remains dismal despite aggressive therapy and decades of research focused on advanced surgery, radiation, chemotherapy, and targeted agents, Dr. Desjardins and her colleagues said.
Accordingly, they sought to evaluate the potential of PVSRIPO, a live-attenuated poliovirus type 1 vaccine with its viral internal ribosome entry site replaced by that of human rhinovirus type 2. The engineered virus gains entry via the CD155 receptor, which is upregulated in solid tumors such as glioblastomas and expressed in antigen-presenting cells.
“Tumor cytotoxic effects, interferon-dominant activation of antigen-presenting cells, and the profound inflammatory response to poliovirus may counter tumor-induced immunosuppression and instigate antitumor immunity,” the investigators wrote.
With a median follow-up of 27.6 months, the median overall survival for PVSRIPO-treated patients was 12.5 months, longer than the 11.3 months seen in the historical control group. It was also longer than the 6.6 months found in a second comparison group of patients who underwent therapy with tumor-treating fields, which involves application of alternating electrical current to the head.
Survival hit a “plateau” in the PVSRIPO-treated patients, investigators said, with an overall survival rate of 21% at both 24 and 36 months. That stood in contrast to a decline in the historical control group from 14% at 24 months to 4% at 36 months, and a decline from 8% to 3% in the tumor-treating-fields group.
The phase 1 study had a dose-escalation phase including 9 patients and a dose-expansion phase with 52 patients. In the dose-expansion phase, 19% of patients had grade 3 or greater adverse events attributable to PVSRIPO, according to the report.
Of all 61 patients, 69% had a vaccine-related grade 1 or 2 event as their most severe adverse event.
One patient death caused by complications from an intracranial hemorrhage was attributed to bevacizumab. As part of a study protocol amendment, bevacizumab at half the standard dose was allowed to control symptoms of locoregional inflammation, investigators said.
In an ongoing, phase 2, randomized trial, PVSRIPO is being evaluated alone or with lomustine in patients with recurrent WHO grade IV malignant glioma. The Food and Drug Administration granted breakthrough therapy designation to PVSRIPO in May 2016.
Seven study authors reported equity in Istari Oncology, a biotechnology company that is developing PVSRIPO. Authors also reported disclosures related to Genentech/Roche, Celgene, Celldex, and Eli Lilly, among other entities. The study was supported by grants from the Brain Tumor Research Charity, the Tisch family through the Jewish Communal Fund, the National Institutes of Health, and others.
SOURCE: Desjardins A et al. N Engl J Med. 2018 Jun 26. doi: 10.1056/NEJMoa1716435.
The potentially useful anticancer properties of viruses are just starting to be recognized and exploited, Dan L. Longo, MD, and Lindsey R. Baden, MD, both with the Dana-Farber Cancer Institute at Brigham and Women’s Hospital, Boston, said in an editorial.
One approach is the development of oncolytic viruses that can not only directly kill tumor cells, but can also prompt an immune response against viable tumor cells, they wrote. The study by Dr. Desjardins and her colleagues describes clinical experience with PVSRIPO, a recombinant, nonpathogenic polio-rhinovirus chimera. This engineered virus targets glioblastoma by gaining cell entry through the CD155 receptor, which is expressed on solid tumors.
The survival data showed a plateau, with a 36-month survival rate of 21%, compared with 4% for a historical control cohort of patients, Dr. Longo and Dr. Baden noted.
In this study, PVSRIPO was delivered into intracranial tumors using an indwelling catheter. One of the outstanding questions with viral approaches to cancer treatment, according to the editorialists, is how local administration impacts systemic immunity in terms of recognition and elimination of remote lesions.
“Much more needs to be learned, but the clinical results to date encourage further exploration of this new treatment approach,” Dr. Longo and Dr. Baden wrote.
This summary is based on an editorial written by Dr. Longo and Dr. Baden that appeared in the New England Journal of Medicine. Both are employed by the New England Journal of Medicine as deputy editors. Dr. Baden reported grant support from the Ragon Institute, the National Institutes of Health and the National Institute of Allergy and Infectious Diseases, and the Gates Foundation outside the submitted work, and also reported involvement in HIV vaccine trials conducted in collaboration with the NIH, the HIV Vaccine Trials Network, and others.
Treatment with the recombinant poliovirus vaccine PVSRIPO in patients with recurrent glioblastoma can be delivered at a safe dose with efficacy that compares favorably with historical data, recently reported results of a phase 1, nonrandomized study suggest.
The survival rate at 36 months after intratumoral infusion of PVSRIPO was 21%, versus 4% in a control group of patients who would have met the study’s eligibility criteria, investigators wrote in the New England Journal of Medicine.
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Key clinical point: Delivery of PVSRIPO was safe, with efficacy comparing favorably with historical data.
Major finding: Overall survival reached 21% at 24 months and remained at 21% at 36 months.
Study details: A phase 1 study including 61 patients with recurrent World Health Organization grade IV glioma.
Disclosures: Seven study authors reported equity in Istari Oncology, a biotechnology company that is developing PVSRIPO. Study authors also reported disclosures related to Genentech/Roche, Celgene, Celldex, and Eli Lilly, among other entities. The study was supported by grants from the Brain Tumor Research Charity, the Tisch family through the Jewish Communal Fund, the National Institutes of Health, and others.
Source: Desjardins A et al. N Engl J Med. 2018 Jun 26. doi: 10.1056/NEJMoa1716435.
Intensive nonaspirin NSAID use linked to reduced ovarian cancer mortality
Intensive use of nonaspirin nonsteroidal anti-inflammatory drugs (NSAIDs) was associated with improved survival of patients with serous ovarian cancer in a recent population-based study.
By contrast, any use of nonaspirin NSAIDs was not associated with survival benefit, according to the authors of the study, which was based on records for more than 4,000 patients in the Danish Cancer Registry.
“More intensive use of nonaspirin NSAIDs appears necessary to obtain a prognostic benefit,” wrote Freija Verdoodt, PhD, postdoctoral researcher with the Danish Cancer Society Research Center, Copenhagen, and her coauthors. The report was published in Gynecologic Oncology.
In addition, there was a suggestion that use of these drugs was associated with increased mortality in patients with nonserous ovarian cancer, Dr. Verdoodt and her colleagues noted.
The study population comprised 4,117 women who were alive 1 year after a diagnosis of epithelial ovarian cancer. Ovarian cancer–specific mortality, the primary outcome of the analysis, was evaluated in relation to postdiagnosis use of nonaspirin NSAIDs.
The investigators found that any postdiagnosis use of nonaspirin NSAIDs was not associated with a difference in mortality, with a hazard ratio of 0.97 (95% confidence interval, 0.87-1.08) after adjusting for factors such as age, clinical stage, and year of diagnosis.
“Nonaspirin NSAIDs are typically used sporadically, and thus limited use among a substantial proportion of the postdiagnosis users may have attenuated the mortality risk estimates of our main analysis,” Dr. Verdoodt and her coauthors wrote.
However, increasing cumulative dose was associated with decreases in mortality, with hazard ratios of 1.03, 0.96, and 0.75 for low, medium, and high cumulative doses, respectively.
Likewise, the intensity of use, defined as cumulative dose divided by the number of days between the first and most recent postdiagnosis NSAID prescription, was associated with decreased mortality, with hazard ratios of 1.04, 0.98, and 0.86 for low, medium, and high use intensity, the reported data show.
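The study's intensity-of-use metric can be made concrete with a small sketch. The prescription records and dose units below are hypothetical; the only element taken from the study is the definition itself — cumulative dose divided by the number of days between the first and most recent postdiagnosis prescription.

```python
from datetime import date

def use_intensity(prescriptions):
    """Cumulative dose divided by days between first and last fill.

    `prescriptions` is a list of (fill_date, dose) tuples; dose units are
    whatever the prescription registry records (assumed here, not specified
    in this summary).
    """
    dates = [d for d, _ in prescriptions]
    cumulative_dose = sum(dose for _, dose in prescriptions)
    span_days = (max(dates) - min(dates)).days
    if span_days == 0:  # a single fill date: the ratio is undefined here
        return None
    return cumulative_dose / span_days

# Hypothetical record: three 30-unit NSAID fills over about four months
fills = [(date(2015, 1, 10), 30), (date(2015, 3, 1), 30), (date(2015, 5, 9), 30)]
print(use_intensity(fills))  # 90 units over 119 days ≈ 0.756 per day
```

In the study, values of this ratio were then grouped into low, medium, and high categories; the cutoffs for those categories are not given in this summary.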
When stratified by tumor histology, NSAID use was associated with reduced ovarian cancer–specific mortality among women with serous ovarian tumors (hazard ratio, 0.87; 95% CI, 0.77-0.99), and post hoc analyses showed further reductions in mortality with high cumulative doses (hazard ratio, 0.62) and high intensity of use (hazard ratio, 0.79).
Conversely, other histologies were associated with increases in ovarian cancer mortality, though the numbers of patients in these subgroups were small, limiting interpretation of the results, investigators said.
Prior to this study, there were few epidemiologic investigations of the impact of nonaspirin NSAIDs on ovarian cancer prognosis, and of those, most did not include separate estimates based on histological subtypes, according to the investigators.
Although this study suggests intensive nonaspirin NSAID use comes with a potential prognostic benefit, these drugs also have potential adverse effects, including serious cardiovascular adverse events, the investigators said.
“A consideration of such risks in the light of a survival benefit among poor-prognosis serous ovarian cancer patients should guide further research,” they wrote.
The study was supported by the Sapere Aude program of the Independent Research Fund Denmark and the Mermaid project. The authors declared no conflicts of interest.
SOURCE: Verdoodt F et al. Gynecol Oncol. 2018 Jun 27. doi: 10.1016/j.ygyno.2018.06.018.
FROM GYNECOLOGIC ONCOLOGY
Key clinical point: Intensive use of nonaspirin nonsteroidal anti-inflammatory drugs (NSAIDs) was associated with improved survival of patients with serous ovarian cancer.
Major finding: Among women with serous ovarian tumors, NSAID use was associated with reduced ovarian cancer–specific mortality (HR, 0.87; 95% CI, 0.77-0.99), and post hoc analyses showed further reductions in mortality with high cumulative doses (HR, 0.62) and high use intensity (HR, 0.79).
Study details: A population-based study of the Danish Cancer Registry including 4,117 women alive at least 1 year after an ovarian cancer diagnosis.
Disclosures: The study was supported by the Sapere Aude program of the Independent Research Fund Denmark and the Mermaid project. The authors declared no conflicts of interest.
Source: Verdoodt F et al. Gynecol Oncol. 2018 Jun 27. doi: 10.1016/j.ygyno.2018.06.018.
Mutations may be detectable years before AML diagnosis
Individuals who develop acute myeloid leukemia (AML) may have somatic mutations detectable years before diagnosis, a newly published analysis shows.
Mutations in IDH1, IDH2, TP53, DNMT3A, TET2, and spliceosome genes at baseline assessment increased the odds of developing AML with a median follow-up of 9.6 years in the study, which was based on blood samples from participants in the Women’s Health Initiative (WHI).
The findings suggest a “premalignant landscape of mutations” that may precede overt AML by many years, according to Pinkal Desai, MD, assistant professor of medicine at Cornell University and oncologist at New York–Presbyterian/Weill Cornell Medical Center, New York, and her coauthors.
“The ability to detect and identify high-risk mutations suggests that monitoring strategies for patients, as well as clinical trials of potentially preventative or disease-intercepting interventions should be considered,” wrote Dr. Desai and her colleagues. The report was published in Nature Medicine.
Their analysis comprised 212 women who participated in the WHI who were healthy at the baseline evaluation but went on to develop AML during follow-up. They performed deep sequencing on peripheral blood DNA for these cases and for 212 age-matched controls.
Women who developed AML were more likely than controls to have mutations at the baseline assessment (odds ratio, 4.86; 95% confidence interval, 3.07-7.77) and demonstrated greater clonal complexity (comutations in 46.8% vs. 5.5%, respectively; odds ratio, 9.01; 95% CI, 4.1-21.4), investigators found.
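For orientation, an unadjusted odds ratio with a Wald 95% confidence interval can be computed from a 2×2 table as sketched below. The counts are hypothetical (chosen only to roughly resemble the comutation percentages above), and the ratios reported in the study are model-based estimates, so a raw calculation like this is not expected to reproduce them.

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Unadjusted odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls.
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) from the four cell counts
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, (lo, hi)

# Hypothetical cells: 99/212 cases and 12/212 controls with comutations
or_, (lo, hi) = odds_ratio_ci(99, 113, 12, 200)
print(round(or_, 1), round(lo, 1), round(hi, 1))
```

The published odds ratios are adjusted for covariates, which is why an adjusted estimate such as 9.01 can differ substantially from the crude ratio implied by the raw percentages.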
All 21 patients with TP53 mutations went on to develop AML, as did all 15 with IDH1 or IDH2 mutations and all 3 with RUNX1 mutations. Multivariate analysis showed that mutations in TP53, IDH1 and IDH2, TET2, DNMT3A, and several spliceosome genes were associated with significantly increased odds of AML, compared with controls.
Based on these results, Dr. Desai and colleagues proposed that patients at increased AML risk should be followed in long-term monitoring studies that incorporate next-generation sequencing.
“Data from these studies will provide a robust rationale for clinical trials of preventative intervention strategies in populations at high risk of developing AML,” they wrote.
In clinical practice, monitoring individuals for AML-associated mutations will become more feasible as costs decrease and new therapies with favorable toxicity profiles are introduced, they added.
“Molecularly targeted therapy is already available for IDH2 mutations and is under development for mutations in other candidate genes found in this study including IDH1, TP53 and spliceosome genes,” they wrote.
The authors reported having no relevant financial disclosures. The WHI program is funded by the National Institutes of Health.
SOURCE: Desai P et al. Nat Med. 2018;24:1015-23.
Individuals who develop acute myeloid leukemia (AML) may have somatic mutations detectable years before diagnosis, a newly published analysis shows.
Mutations in IDH1, IDH2, TP53, DNMT3A, TET2, and spliceosome genes at baseline assessment increased the odds of developing AML with a median follow-up of 9.6 years in the study, which was based on blood samples from participants in the Women’s Health Initiative (WHI).
Individuals who develop acute myeloid leukemia (AML) may have somatic mutations detectable years before diagnosis, a newly published analysis shows.
Mutations in IDH1, IDH2, TP53, DNMT3A, TET2, and spliceosome genes at baseline assessment increased the odds of developing AML over a median follow-up of 9.6 years in the study, which was based on blood samples from participants in the Women’s Health Initiative (WHI).
The findings suggest a “premalignant landscape of mutations” that may precede overt AML by many years, according to Pinkal Desai, MD, assistant professor of medicine at Cornell University and oncologist at New York–Presbyterian/Weill Cornell Medical Center, New York, and her coauthors.
“The ability to detect and identify high-risk mutations suggests that monitoring strategies for patients, as well as clinical trials of potentially preventative or disease-intercepting interventions should be considered,” wrote Dr. Desai and her colleagues. The report was published in Nature Medicine.
Their analysis comprised 212 women who participated in the WHI who were healthy at the baseline evaluation but went on to develop AML during follow-up. They performed deep sequencing on peripheral blood DNA for these cases and for 212 age-matched controls.
Women who developed AML were more likely than controls to have mutations at baseline assessment (odds ratio, 4.86; 95% confidence interval, 3.07-7.77) and showed greater clonal complexity than controls (comutations in 46.8% vs. 5.5%, respectively; odds ratio, 9.01; 95% CI, 4.1-21.4), investigators found.
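To illustrate how an odds ratio and its confidence interval are derived, here is a minimal sketch that reconstructs approximate 2x2 counts from the reported comutation rates (46.8% of 212 cases, 5.5% of 212 controls). Note that the crude odds ratio computed this way differs from the published value of 9.01, which was adjusted for covariates; the counts are illustrative, not taken from the paper.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with Woolf 95% CI from a 2x2 table:
    a = cases with the feature, b = cases without,
    c = controls with the feature, d = controls without."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Counts reconstructed from the reported percentages -- illustrative only.
# 46.8% of 212 cases ~ 99; 5.5% of 212 controls ~ 12.
or_, lo, hi = odds_ratio_ci(99, 113, 12, 200)
print(f"crude OR {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```

The crude estimate comes out higher than the adjusted odds ratio reported by the investigators, a reminder that published effect sizes from multivariate models cannot be reproduced from marginal percentages alone.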
All 21 patients with TP53 mutations went on to develop AML, as did all 15 with IDH1 or IDH2 mutations and all 3 with RUNX1 mutations. Multivariate analysis showed that TP53, IDH1, IDH2, TET2, DNMT3A, and several spliceosome genes were associated with significantly increased odds of AML versus controls.
Based on these results, Dr. Desai and colleagues proposed that patients at increased AML risk should be followed in long-term monitoring studies that incorporate next-generation sequencing.
“Data from these studies will provide a robust rationale for clinical trials of preventative intervention strategies in populations at high risk of developing AML,” they wrote.
In clinical practice, monitoring individuals for AML-associated mutations will become more feasible as costs decrease and new therapies with favorable toxicity profiles are introduced, they added.
“Molecularly targeted therapy is already available for IDH2 mutations and is under development for mutations in other candidate genes found in this study including IDH1, TP53 and spliceosome genes,” they wrote.
The authors reported having no relevant financial disclosures. The WHI program is funded by the National Institutes of Health.
SOURCE: Desai P et al. Nat Med. 2018;24:1015-23.
FROM NATURE MEDICINE
Key clinical point: Somatic mutations detectable in blood years before diagnosis may identify individuals at high risk of developing AML.
Major finding: Compared with controls, those who eventually developed AML were more likely to have mutations (odds ratio, 4.86; 95% CI, 3.07-7.77) in baseline assessment at a median of 9.6 years before diagnosis.
Study details: Analysis of blood samples from 212 women who developed AML and 212 age-matched controls in the Women’s Health Initiative.
Disclosures: The researchers reported having no relevant financial disclosures. The WHI program is funded by the National Institutes of Health.
Source: Desai P et al. Nat Med. 2018;24:1015-23.
DOACs’ safety affirmed in real-world setting
Direct oral anticoagulants were associated with decreased bleeding risk versus warfarin in a recent retrospective analysis of primary care databases.
Apixaban (Eliquis) was associated with decreased risk of major bleeding events versus warfarin both in patients with atrial fibrillation (AF) and those prescribed anticoagulants for other causes, according to study results.
Rivaroxaban (Xarelto) was associated with a decrease in risk of intracranial bleeding, compared with warfarin in patients without AF, as was dabigatran (Pradaxa), reported Yana Vinogradova, a research statistician in the division of primary care at the University of Nottingham, England, and her coauthors.
An increased risk of all-cause mortality was seen with both rivaroxaban and low-dose apixaban, possibly because more patients died of age-related causes while on these direct oral anticoagulants (DOACs), they reported.
“This large observational study, based on a general population in a primary care setting, provides reassurance about the safety of DOACs as an alternative to warfarin across all new incident users,” Ms. Vinogradova and her colleagues said in the BMJ.
Evidence establishing the noninferiority of DOACs to warfarin comes mostly from controlled trials in AF, leaving “residual concerns” about the safety of these newer agents in real-world settings, where a broader range of patients may receive them, they added.
Accordingly, they conducted an analysis based on patient data from two U.K. primary care databases that were representative of the national population, according to the researchers.
A total of 196,061 patients were represented in the study, including 103,270 (53%) with AF and 92,791 (47%) who received anticoagulants for other reasons.
A total of 67% of patients received warfarin, though its use declined from 98% in 2011, the beginning of the study period, to 23% in 2016, the end of the study period. Over that same time period, use of rivaroxaban rose from 1% to 42%, and use of apixaban rose from 0% to 31%, while dabigatran use peaked in 2013 at 10%, dropping to 3% by 2016.
Edoxaban was excluded from the study because it was not licensed in the United Kingdom until the end of 2015, investigators said.
For patients with AF, apixaban was linked to a lower major bleeding risk, both versus warfarin (adjusted hazard ratio, 0.66; 95% confidence interval, 0.54-0.79) and versus rivaroxaban, the published data show. Apixaban was associated with a lower risk of intracranial bleed versus warfarin in patients with AF (aHR, 0.40; 95% CI, 0.25-0.64) as was dabigatran (aHR, 0.45; 95% CI, 0.26-0.77).
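The percent risk reductions cited in coverage of hazard ratios (for example, the 34% lower major bleeding risk implied by the aHR of 0.66) follow from a simple conversion, sketched below along with a check of whether a confidence interval excludes the null value of 1.

```python
def percent_reduction(hr):
    """Relative risk reduction (as a whole percent) implied by a hazard ratio below 1."""
    return round((1 - hr) * 100)

def excludes_null(lo, hi, null=1.0):
    """True when a confidence interval excludes the null value, i.e. the
    result is statistically significant at the interval's level."""
    return hi < null or lo > null

# aHR 0.66 (95% CI, 0.54-0.79) for major bleeding, apixaban vs. warfarin in AF
print(percent_reduction(0.66))    # -> 34
print(excludes_null(0.54, 0.79))  # -> True (interval lies entirely below 1)
```

The same conversion gives the 40% figure quoted in the study summary for patients prescribed anticoagulants for other causes (an aHR of 0.60).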
For patients without AF, apixaban was again associated with a lower risk of major bleeding versus warfarin and versus rivaroxaban, while rivaroxaban was associated with lower intracranial bleeding risk versus warfarin, and apixaban with lower risks for gastrointestinal bleeds.
Compared with apixaban, rivaroxaban and dabigatran were associated with higher risks of certain bleeding events, further analyses show.
Rivaroxaban and lower-dose apixaban were both associated with increased all-cause mortality risk versus warfarin, in both the AF and non-AF groups, Ms. Vinogradova and her coinvestigators noted.
“A greater proportion of the older patients on apixaban and rivaroxaban may have died while still taking anticoagulants but from age-related causes other than ischemic stroke or venous thromboembolism,” they wrote.
Compared with patients on higher doses of DOACs, patients receiving lower doses were older and had more comorbidities and more previous events, they added.
Between DOACs, results of this particular analysis were most favorable for apixaban, according to investigators.
“Our study has shown that the risk of major bleeding is lower in patients taking apixaban regardless of the reason for prescribing,” they wrote. “This was most pronounced for intracranial bleeding in patients with atrial fibrillation and for gastrointestinal bleeding in patients without atrial fibrillation, appearing, in general, to show apixaban to be the safest drug.”
The study was supported by a grant from the National Institute for Health Research. The investigators had no relevant disclosures.
SOURCE: Vinogradova Y et al. BMJ. 2018;362:k2505.
FROM THE BMJ
Key clinical point: Among direct oral anticoagulants prescribed in primary care, apixaban carried the lowest risk of major bleeding relative to warfarin, regardless of indication.
Major finding: Apixaban was linked with an adjusted 34% decreased risk of major bleeding in patients with AF and a 40% lower risk in those prescribed anticoagulants for other causes, compared with warfarin.
Study details: A retrospective cohort study representing 196,061 patients from two U.K. primary care databases.
Disclosures: The study was supported by a grant from the National Institute for Health Research. The investigators had no relevant disclosures.
Source: Vinogradova Y et al. BMJ 2018;362:k2505.
PET-driven chemo strategy helps reduce toxicity in Hodgkin lymphoma
CHICAGO – Positron emission tomography (PET) performed after two cycles of BEACOPP could help identify a subset of advanced-stage Hodgkin lymphoma patients who can receive de-escalated treatment without compromising disease control, results of a phase 3 randomized trial show.
Five-year progression-free survival (PFS) exceeded 85% not only for patients receiving six cycles of escalated BEACOPP (bleomycin, etoposide, doxorubicin, cyclophosphamide, vincristine, procarbazine, and prednisone) but also for patients who were de-escalated to ABVD (doxorubicin, bleomycin, vinblastine, dacarbazine) chemotherapy based on negative PET results, according to the final analysis of the AHL2011 LYSA study, presented at the annual meeting of the American Society of Clinical Oncology.
“This approach allows us to significantly reduce the treatment-related toxicity in most patients, and provides similar patient outcomes compared to standard BEACOPP escalated treatment,” said Olivier Casasnovas, MD, of the University Hospital Le Bocage and Inserm, Dijon, France.
Previous studies have shown that BEACOPP may improve PFS, compared with ABVD, but is more toxic and associated with a higher risk of infertility and myelodysplasia or acute leukemia. Dr. Casasnovas and his colleagues sought to evaluate whether upfront BEACOPP followed by de-escalation to ABVD, when warranted by negative PET results, would improve outcomes without compromising efficacy.
The AHL2011 LYSA study included 823 patients (median age, 30 years; 63% male) with previously untreated advanced classical Hodgkin lymphoma. All patients received PET at baseline, after cycle two of chemotherapy, and again after cycle four.
Patients were randomized either to six cycles of escalated BEACOPP, or to an experimental arm in which patients started with BEACOPP but were de-escalated to ABVD if PET results were negative after two or four cycles of treatment.
PET results after cycle two were negative for 87% of patients in the experimental arm, so on an intent-to-treat basis, 84% of them received two cycles of BEACOPP and four cycles of ABVD, Dr. Casasnovas reported.
PFS, with a median follow-up of 50.4 months, was not significantly different for the standard versus the experimental arm (hazard ratio, 1.084; 95% confidence interval, 0.73-1.59; P = .68). Five-year PFS was 85.7% in the experimental treatment de-escalation arm, compared with 86.2% in the standard arm.
Overall survival was likewise similar between arms, with 5-year overall survival exceeding 95% in both groups, Dr. Casasnovas said.
Although there was no significant difference overall in the incidence of adverse events, there was significantly less anemia, febrile neutropenia, thrombocytopenia, and sepsis in the PET-driven de-escalation arm. Overall, 27% of patients in the standard chemotherapy arm experienced at least one serious adverse event, compared with 17% in the PET-driven arm (P less than .002).
The incidence of second primary malignancies was numerically lower in the experimental arm (1.2% vs. 2.4%), though that finding did not reach statistical significance.
On multivariate analysis, interim PET positivity after cycle two was associated with increased risk of disease progression for patients in this study (HR, 3.316). Risk of progression was even higher for patients with PET positivity after four cycles (HR, 12.968), identifying a subset of patients with “particularly poor outcome” who could benefit from development of new treatment options, Dr. Casasnovas said.
Dr. Casasnovas reported financial ties to AbbVie, Bristol-Myers Squibb, Celgene, Gilead Sciences, Janssen, Merck, Roche/Genentech, Sanofi, and Takeda.
SOURCE: Casasnovas O et al. ASCO 2018, Abstract 7503.
REPORTING FROM ASCO 2018
Key clinical point: Interim PET after two cycles of escalated BEACOPP can guide de-escalation to ABVD without compromising disease control in advanced Hodgkin lymphoma.
Major finding: With a median follow-up of 50.4 months, PFS was not significantly different for escalated BEACOPP vs. the experimental arm (HR, 1.084; 95% CI, 0.73-1.59; P = .68).
Study details: Final analysis of AHL2011 LYSA, a randomized phase 3 study including 823 patients with advanced-stage Hodgkin lymphoma.
Disclosures: Dr. Casasnovas reported financial ties to AbbVie, Bristol-Myers Squibb, Celgene, Gilead Sciences, Janssen, Merck, Roche/Genentech, Sanofi, and Takeda.
Source: Casasnovas O et al. ASCO 2018, Abstract 7503.
Do carbs drive obesity? With evidence inconclusive, debate continues
While the debate continues, David S. Ludwig, MD, PhD, and Cara B. Ebbeling, PhD, argued in a recent clinical review that diet does indeed affect metabolism and body composition.
While evidence from human studies remains limited, animal research findings are consistent with a carbohydrate-insulin model of obesity, according to Dr. Ludwig and Dr. Ebbeling, who are with the New Balance Foundation Obesity Prevention Center at Boston Children’s Hospital and Harvard Medical School.
The carbohydrate-insulin model holds that eating processed, high–glycemic load carbohydrates causes hormonal changes that promote calorie deposition in fat tissue, aggravate hunger, and reduce energy expenditure, they said in JAMA Internal Medicine.
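Glycemic load, the quantity at the center of the carbohydrate-insulin model, combines a food's glycemic index with the amount of carbohydrate actually eaten (glycemic index times available carbohydrate in grams, divided by 100). A minimal sketch, using illustrative GI values rather than figures from the review:

```python
def glycemic_load(glycemic_index, carb_grams):
    """Glycemic load = glycemic index x available carbohydrate (g) / 100."""
    return glycemic_index * carb_grams / 100

# Illustrative values only; published GI tables vary by food and preparation.
print(glycemic_load(75, 30))  # a higher-GI food, 30 g carbs -> 22.5
print(glycemic_load(40, 30))  # a lower-GI food, same carbs  -> 12.0
```

Two servings with identical carbohydrate content can thus differ substantially in glycemic load, which is the distinction the model draws between processed, high-GI carbohydrates and lower-GI alternatives.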
“The conventional way of thinking assumes that the individual has primary control over their calorie balance, and thus, bases conventional treatment on a target of establishing a negative energy balance – so that is 1,000 variations of the ‘eat less, move more’ recommendation,” Dr. Ludwig said in an interview.
The alternative to that established view has proven controversial. The Endocrine Society, in a recent scientific statement, said diet’s effect on obesity risk is largely explainable by calorie intake, rather than some special adverse effect on internal metabolism or energy expenditure.
“Stated differently, ‘a calorie is a calorie,’ ” the authors of the scientific statement said. “Thus, habitual consumption of highly palatable and energy-dense diets predispose to excess weight gain irrespective of macronutrient content.”
Others have sought to refute the carbohydrate-insulin hypothesis in recent reviews, such as an invited commentary in JAMA Internal Medicine by Kevin D. Hall, PhD, of the National Institute of Diabetes and Digestive and Kidney Diseases, and his coauthors.
“Although it is plausible that variables related to insulin signaling could be involved in obesity pathogenesis, the hypothesis that carbohydrate-stimulated insulin secretion is the primary cause of common obesity via direct effects on adipocytes is difficult to reconcile with current evidence,” Dr. Hall and his coauthors wrote in the commentary (JAMA Intern Med. 2018 Jul 2. doi: 10.1001/jamainternmed.2018.2920).
The conventional calorie balance model is a “straw man” that omits neuroendocrine mechanisms known to regulate homeostasis, added Dr. Hall and his coauthors, stating that accurate models of obesity should include physiological processes resisting weight loss and promoting weight gain.
“They might claim that this is a straw man argument, but I would claim that there is a case of the emperor’s new clothing,” Dr. Ludwig countered in the interview. “They argue that body weight is controlled by biology, and that that’s recognized in the conventional view, but how does that view inform treatment in any way? In the absence of any specific testable hypotheses for why the obesity epidemic has emerged so suddenly, conventional recommendations inevitably resort to advice to ‘eat less and move more.’ ”
Dr. Ludwig and Dr. Ebbeling have both conducted research studies examining the carbohydrate-insulin model, the view that a high-carbohydrate diet results in postprandial hyperinsulinemia and promotes deposition of calories in adipocytes, leading to weight gain through slowed metabolism, increased hunger, or both.
In a study published in the Lancet, Dr. Ludwig and his coinvestigators found that rats fed a high–glycemic index (GI) diet for 18 weeks had more body fat (97.8 g vs. 57.3 g; P = .0152) and less lean body mass than rats fed a low-GI diet. Rats on the high-GI diet also had greater increases over time in blood glucose and plasma insulin after oral glucose. Similarly, mice on a high-GI diet had nearly twice the body fat of mice on a low-GI diet after 9 weeks of feeding (Lancet. 2004 Aug 28. doi: 10.1016/S0140-6736(04)16937-7).
“There’s no way to explain that finding in view of the conventional view that all calories are alike to the body,” Dr. Ludwig said.
“Contrary to prediction of the conventional model, the inherently lower energy density of low-fat diets does not spontaneously produce sustained weight loss. In fact, several recent meta-analyses found that low-fat diets are inferior to all higher-fat [and thus low-glycemic] comparisons. However, these studies characteristically rely on dietary counseling, a method with limitations for testing mechanistic hypotheses owing to varying levels of noncompliance over the long-term,” Dr. Ludwig and Dr. Ebbeling wrote.
Criticisms claiming to refute the carbohydrate-insulin hypothesis rest in part on misinterpretation of recent feeding studies, according to Dr. Ludwig and Dr. Ebbeling. Multiple studies testing whether high–glycemic load meals lead to increased fat storage have reported no meaningful differences between low-fat and low-carbohydrate diets. However, these short-term studies, most only 2 weeks in duration, preclude definitive findings, according to the review.
That’s because the process of adapting to a high-fat diet after having consumed a high-carbohydrate diet takes weeks, which is a well-recognized phenomenon, Dr. Ludwig said.
“If you put sedentary people into military boot camp and tested their biological state after 6 days, you’d probably find that they were fatigued, weak, and had higher inflammation in their muscles, but clearly, you wouldn’t conclude that fitness training is bad for your health,” he said in the interview. “But yet, these are the sort of data that are being used to ‘falsify’ the carbohydrate-insulin model.
“We acknowledge that there aren’t definitive human data,” he continued, “but the conventional model has failed to both explain the obesity epidemic and control it, and the latest public health data suggests that rates are higher today than ever before, despite 50 years of focusing on calorie balance.”
SOURCE: Ludwig DS et al. JAMA Intern Med. 2018 Jul 2. doi:10.1001/jamainternmed.2018.2933.
FROM JAMA INTERNAL MEDICINE