VIDEO: Blinatumomab, inotuzumab reshape relapsed ALL treatment
NEW YORK – A pair of new monoclonal antibodies has dramatically changed the treatment used to prepare patients with acute lymphoblastic leukemia for a stem cell transplant, Daniel J. DeAngelo, MD, said at a conference held by Imedex.
“We don’t use standard chemotherapy for reinduction anymore; we use blinatumomab or inotuzumab,” said Dr. DeAngelo, a hematologist-oncologist at Dana-Farber Cancer Institute in Boston.
Blinatumomab (Blincyto), approved by the Food and Drug Administration in 2014, has produced “exceptional” response rates, becoming “standard of care” for patients with relapsed acute lymphoblastic leukemia (ALL) that does not have a Philadelphia chromosome, Dr. DeAngelo said in a video interview.
Blinatumomab was approved on the basis of a phase II study; its efficacy and safety were further delineated in results from the first phase III trial (N Engl J Med. 2017 Mar 2;376[9]:836-47), which included 376 treated patients. In that trial, blinatumomab more than doubled the complete remission rate, compared with control patients (34% vs. 16%), and nearly doubled median overall survival – 7.7 months with blinatumomab, compared with 4.0 months for control patients treated with standard chemotherapy.
These findings “further substantiated” blinatumomab’s role, he said.
Blinatumomab’s big limitations are certain adverse effects and the logistics of its dosing. The major adverse effect is “cytokine release syndrome,” which manifests as fever, low blood pressure, and neurologic toxicities that can range from tremors to encephalopathy and seizure. These are manageable with close observation by experienced nurses, Dr. DeAngelo said.
Dosing involves 4 weeks of continuous infusion: the first 10 days take place entirely in the hospital, and for the remaining 18 days patients go home but must return every 48 hours to have their infusion bag changed. “Depending on how far the patient lives from the clinic, it can be a logistical challenge,” he said.
A second new antibody he has used in many patients is inotuzumab, which the FDA accepted for review in February 2017, with action expected by August.
Dr. DeAngelo served as a coinvestigator in a phase III trial reported in 2016 with 218 evaluable patients. In that trial, investigators reported an 81% complete remission rate with inotuzumab treatment, compared with 29% among control patients on chemotherapy (N Engl J Med. 2016 Aug 25;375[8]:740-53).
Inotuzumab was effective in patients with Philadelphia chromosome-positive ALL, but it will not work for the roughly 5%-10% of patients whose B-cell ALL lacks CD22 expression.
Inotuzumab is easier to administer than blinatumomab, requiring only a once-weekly infusion, and causes little immediate toxicity – although thrombocytopenia and liver-function abnormalities can occur with continued use, and the risk of veno-occlusive disease is increased when patients later receive a stem cell transplant, Dr. DeAngelo said.
“It’s nice to have options” when choosing antibody-based treatment, he said. Blinatumomab is a good choice for patients with a lower tumor burden – either patients with early relapse or with minimal residual disease – while inotuzumab works better for patients with more bulky disease, as well as those who are not able to accommodate the logistic demands of blinatumomab infusions.
Dr. DeAngelo also highlighted several trials now underway that are testing the efficacy of both antibodies when used as part of first-line treatment.
Dr. DeAngelo has been a consultant to Amgen, the company that markets blinatumomab (Blincyto); to Pfizer, the company developing inotuzumab; and to Ariad, Incyte, and Novartis.
EXPERT ANALYSIS FROM A MEETING ON HEMATOLOGIC MALIGNANCIES
Avoid laxatives or stool softeners prior to C. difficile testing in pediatric inpatients
Improved education could keep clinicians from ordering Clostridium difficile tests on patients who have just received laxatives or stool softeners, improving the specificity of such testing.
In a 4-month prospective cohort study of pediatric inpatients, most clinicians were aware that their patients had received laxatives or stool softeners before C. difficile tests were sent.
Misclassifying colonized patients as having C. difficile infection (CDI) reduces the utility of interfacility data comparison and may result in inappropriate penalties for hospitals, the study authors added.
From August to November 2015, 217 pediatric inpatients underwent 278 C. difficile tests that were eligible for inclusion in the study. In 48 of those testing episodes, involving 44 patients, bowel medications were administered prior to the test.
Of the 48 tests sent after receipt of bowel medications, either the ordering clinician or the bedside nurse completed the qualitative survey in 42 instances (88%). The response rate was higher among nurses (71%) than clinicians (48%). Awareness of bowel medication administration before testing was acknowledged by 78% of ordering providers and 88% of bedside nurses. A majority of nurses (70%) and providers (71%) reported the medical team wanted the test.
“The main reasons for testing included a change in stool quality from baseline, other changes in clinical status, and the presence of risk factors for CDI,” Ms. Kinlay and Dr. Sandora said.
Read more in the American Journal of Infection Control (2017 Mar 13. doi: 10.1016/j.ajic.2017.01.035).
FROM THE AMERICAN JOURNAL OF INFECTION CONTROL
MedPAC: Medicare Part B drug payment cuts, shared savings could save $5 billion
WASHINGTON – Reducing the amount physicians are paid for drugs administered in their offices and introducing shared savings could save Medicare up to $5 billion over 5 years, according to recommendations from the Medicare Payment Advisory Commission.
Those MedPAC recommendations to Congress include cutting physicians’ average sales price add-on percentage, as well as an alternative purchasing initiative called the Drug Value Program that would allow shared savings through more effective pharmaceutical utilization.
Physicians should not be in a position to provide Part B drugs at a financial loss, MedPAC chairman Francis J. Crosson, MD, noted. But the current 6% add-on to average sales price (ASP) “overpays many physicians and institutions, and is inherently a cost-inefficient payment system for the Medicare program,” he added.
Dr. Crosson also noted that current free market principles do not seem to be working effectively to keep drug costs down.
MedPAC’s proposal is designed to strengthen market dynamics for Part B drugs by “creating more equilibrium between the buyer and the seller than currently exists,” Dr. Crosson explained. An alternative reimbursement system will lower overall drug costs for patients while preserving quality and sharing savings with physicians, he said.
If implemented, the proposals could save Medicare between $250 million and $750 million in the first year, and between $1 billion and $5 billion within 5 years, MedPAC staff said.
MedPAC members voted unanimously, with one member absent, to move the two-part recommendation forward to Congress.
The first part, which would start in 2018, would alter the current Part B drug payment process. Currently, doctors receive ASP plus 6%, or wholesale acquisition cost (WAC) plus 6% for drugs without sufficient ASP history.
The proposal would enhance ASP reporting, including requiring more manufacturers to submit data and increasing fines by an unspecified amount for those that fail to meet reporting standards. The WAC add-on percentage would be reduced to 3%.
A to-be-determined inflation index would be applied to ASP and would trigger automatic rebates if ASP climbs faster than inflation. Finally, billing codes for biosimilars and their reference products would be combined.
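To make the payment mechanics concrete, here is a minimal arithmetic sketch of the current and proposed add-ons. The 6% and 3% figures come from the recommendation described above; the $100 price, the rebate formula, and the function names are hypothetical illustrations, since MedPAC left the inflation index and rebate amounts unspecified.

```python
# Illustrative arithmetic only: simplified Part B payment math under the
# current rule and MedPAC's proposed changes. Dollar figures are hypothetical.

def current_asp_payment(asp):
    """Current rule: average sales price (ASP) plus a 6% add-on."""
    return asp * 1.06

def proposed_wac_payment(wac):
    """Proposed rule for drugs without ASP history: WAC plus 3% instead of 6%."""
    return wac * 1.03

def inflation_rebate(asp, asp_growth, inflation_index):
    """Assumed form of the proposed automatic rebate: if ASP grows faster
    than the (to-be-determined) inflation index, the excess is rebated."""
    return asp * max(asp_growth - inflation_index, 0.0)

print(current_asp_payment(100.00))           # 106.0 -> Medicare pays $106 per unit today
print(proposed_wac_payment(100.00))          # 103.0 -> $103 per unit under the proposal
print(inflation_rebate(100.00, 0.08, 0.02))  # 6.0 -> $6 rebate if ASP rose 8% vs. 2% inflation
```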
Under the second part of MedPAC’s recommendation, in 2022 providers would face a choice: continue to have Part B drugs paid for under the ASP scheme with a reduced add-on percentage of 3%, or take part in the Drug Value Program.
Under the Drug Value Program, physicians would sign up with one of several vendors that would be charged with negotiating prices for Part B drugs. Physicians would pay the negotiated prices for the drugs. Vendors would have standard formulary tools, such as prior authorization, tiering, and step-therapy. For a very small subset of drugs with no competition in the marketplace, the proposal includes a binding arbitration process, the specific details to be determined later.
Savings generated from participating in the Drug Value Program would be shared with providers, much like other value programs that provide opportunities for shared savings in exchange for assuming a level of risk.
It was the binding-arbitration process that garnered the most concern from commission members.
“I am absolutely opposed to arbitration,” Amy Bricker, vice president of supply chain strategy at Express Scripts, St. Louis, said. “The message that the commission is sending is that we believe in free markets, but then we don’t. The free market today would allow for many of the things that we are attempting to do with the DVP.”
She called for more detailed discussion on the arbitration process. Her concerns were echoed by other commission members. “I don’t think that arbitration ultimately results in lowering the pricing,” Ms. Bricker added, suggesting it could also open the door to collusion between DVP vendors.
The proposal will be included in MedPAC’s June 2017 report to Congress.
AT MEDPAC
Cutting back ICU antibiotics could significantly reduce MDRO transmissions
Cutting back on antibiotic courses in intensive care unit settings can significantly reduce the number of multidrug-resistant organism (MDRO) transmissions, according to the findings of a modeling study.
“Significant opportunities exist to optimize and reduce antibiotic usage, [but] the impact of reducing overall antibiotic usage on antibiotic resistance is not known and would be difficult to assess using traditional study designs,” wrote Sean L. Barnes, PhD, of the University of Maryland, College Park, and his colleagues. “Therefore, we applied mathematical modeling to estimate the effect of reducing antibiotic usage on antibiotic resistance.”
Using an agent-based model – which allows for a realistic prediction of interactions between patients and health care workers, while also allowing for heterogeneity in the characteristics of each distinct “person” – Dr. Barnes and his coinvestigators simulated the transmission of MDROs from health care workers to patients.
Methicillin-resistant Staphylococcus aureus and vancomycin-resistant enterococci were deemed “high-prevalence” pathogens; carbapenem-resistant Enterobacteriaceae, multidrug-resistant Acinetobacter baumannii, and multidrug-resistant Pseudomonas aeruginosa were deemed low-prevalence pathogens. These designations were based on transmission rates found in existing literature.
The proportion of patients on an antibiotic course was set at 75% (0.75) at baseline and was then adjusted to determine its effect on overall MDRO transmission. The baseline model included 18 patients, nine nurses, two physicians, and six other health care workers. Mean length of stay was 3.5 days, and hand hygiene rates were set at 80% for nurses and 50% for physicians, with 83% (0.83) efficacy when performed. The probability of worker-to-patient transmission was set at 0.025 (2.5%), and the probability of patient-to-worker transmission at 0.075 (7.5%).
“We simulated the transmission of the high- and low-prevalence MDROs for 1 year [and] performed 200 replications each for 33 parameter-based scenarios,” the authors said.
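For readers unfamiliar with agent-based models, the toy sketch below shows the general mechanics of such a simulation, using the parameter values reported above. The contact structure, the number of contacts per day, the 5% admission prevalence, and the doubled susceptibility for patients on antibiotics are assumptions made purely for illustration; the authors' actual model was considerably richer (pathogen-specific prevalence, distinct worker roles, and more).

```python
import random

# Toy agent-based ICU transmission sketch. Parameter values are from the
# article; the model structure itself is a simplified assumption.
N_PATIENTS = 18
N_WORKERS = 17                # 9 nurses, 2 physicians, 6 other HCWs
P_WORKER_TO_PATIENT = 0.025   # per-contact probability, worker -> patient
P_PATIENT_TO_WORKER = 0.075   # per-contact probability, patient -> worker
HYGIENE_COMPLIANCE = 0.80     # nurses' rate (physicians were 50% in the study)
HYGIENE_EFFICACY = 0.83
ABX_SUSCEPTIBILITY = 2.0      # assumed multiplier for patients on antibiotics
MEAN_LOS_DAYS = 3.5

def simulate(antibiotic_fraction, days=365, contacts_per_day=150, seed=0):
    rng = random.Random(seed)
    on_abx = [rng.random() < antibiotic_fraction for _ in range(N_PATIENTS)]
    colonized = [p == 0 for p in range(N_PATIENTS)]  # one colonized admission
    dirty = [False] * N_WORKERS                      # contaminated hands
    acquisitions = 0
    for _ in range(days):
        for _ in range(contacts_per_day):
            p = rng.randrange(N_PATIENTS)
            w = rng.randrange(N_WORKERS)
            if dirty[w] and not colonized[p]:
                risk = P_WORKER_TO_PATIENT * (ABX_SUSCEPTIBILITY if on_abx[p] else 1.0)
                if rng.random() < risk:
                    colonized[p] = True
                    acquisitions += 1
            elif colonized[p] and rng.random() < P_PATIENT_TO_WORKER:
                dirty[w] = True
            # Hand hygiene may clear the worker's contamination after the contact.
            if dirty[w] and rng.random() < HYGIENE_COMPLIANCE * HYGIENE_EFFICACY:
                dirty[w] = False
        # Daily turnover: each bed empties with probability 1/LOS, and a small
        # assumed fraction of new admissions arrive already colonized.
        for p in range(N_PATIENTS):
            if rng.random() < 1.0 / MEAN_LOS_DAYS:
                colonized[p] = rng.random() < 0.05
                on_abx[p] = rng.random() < antibiotic_fraction
    return acquisitions

def mean_acquisitions(antibiotic_fraction, replications=200):
    """Average over replications, mirroring the study's 200 runs per scenario."""
    return sum(simulate(antibiotic_fraction, seed=s) for s in range(replications)) / replications

# Lowering the share of patients on antibiotics reduces mean simulated acquisitions:
print(mean_acquisitions(0.75), mean_acquisitions(0.50))
```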
When the proportion of patients on an antibiotic course was dropped from 75% to 65% (a drop of 10 percentage points), the rate of high-prevalence MDRO transmission dropped by 11.2% (P < .001). When the proportion was reduced from 75% to 50% (a drop of 25 percentage points), the high-prevalence MDRO transmission rate fell by 28.3% (P < .001), according to the model.
Low-prevalence MDRO transmission also fell significantly when antibiotic regimens were cut back by the same amounts, with transmission rates dropping by 14.3% (P < .001) and 29.8% (P < .001), respectively.
In terms of microbiome effects, the 10% reduction in antibiotics lowered high-prevalence rates by an effect of 1.5, and low-prevalence rates by 1.7; those numbers were 1.2 and 1.4, respectively, when antibiotics were dropped by 25%.
“These reductions are statistically significant and proportionally similar for both high- and low-prevalence MDROs,” the authors concluded, “and they can potentially decrease MDRO acquisition among patients who are receiving antibiotics, as well as among patients who are not receiving antibiotics.”
The National Institutes of Health and the Department of Veterans Affairs’ Health Services Research and Development Department funded the study. Dr. Barnes and his coauthors reported no relevant financial disclosures.
FROM INFECTION CONTROL & HOSPITAL EPIDEMIOLOGY
Key clinical point: Reducing antibiotic use in the ICU can significantly reduce transmission of multidrug-resistant organisms.
Major finding: A 10% reduction in prescribed antibiotic courses saw high-prevalence MDRO transmission drop by 11.2%, and a 25% reduction caused a drop of 28.3%; low-prevalence MDROs dropped by 14.3% and 29.8%, respectively (P < .001 for all).
Data source: An agent-based model of a single ICU with 18 patients and 17 health care workers at baseline.
Disclosures: The National Institutes of Health and the Department of Veterans Affairs’ Health Services Research and Development Department funded the study. Dr. Barnes and his coauthors reported no relevant financial disclosures.
Infants’ head circumference larger with PCOS moms on metformin
Women with polycystic ovarian syndrome (PCOS) who took metformin had newborns with larger heads than the offspring of women with PCOS who took a placebo, though PCOS offspring were, on average, shorter than newborns in a reference population, according to a recent study.
Metformin is commonly prescribed to women with PCOS, and though metformin passes the placental barrier and can reach therapeutic concentrations in the umbilical cord blood, it hasn’t been proven teratogenic, said Anna Hjorth-Hansen, MD, of the internal medicine department at Levanger Hospital, Norway.
The study was a post hoc analysis of data from the PregMet study, which was run from 2005 to 2009, and compared metformin to placebo, in combination with diet and lifestyle changes, for women with PCOS, said Dr. Hjorth-Hansen. The PregMet study tested the hypothesis that women with PCOS who received metformin from the first trimester until delivery had fewer pregnancy complications overall than women who did not.
Though women receiving placebo in the PregMet study had no more preeclampsia, preterm delivery, or gestational diabetes than those who received metformin, the newborns with in utero metformin exposure had significantly larger head circumferences.
Dr. Hjorth-Hansen said that the study looked at in utero growth and anthropometric measurements at birth of infants born to women taking metformin, to determine whether metformin could affect fetal growth and newborn anthropometrics.
Ultrasound examination was used to measure crown-rump length, biparietal diameter (BPD), and mean abdominal diameter (MAD). At birth, head circumference (HC), length, and weight were measured.
Maternal characteristics were comparable between the metformin (131 patients) and placebo (127 patients) groups, said Dr. Hjorth-Hansen. Specifically, there were no significant differences between groups in terms of PCOS phenotype, blood glucose levels, and parity.
Infants born to women who took metformin had, on average, a larger BPD at 32 weeks gestation, compared with the infants whose mothers took placebo (86.1 mm versus 85.2 mm, P = .027). This larger head size was also seen at birth (mean HC, metformin, 35.6 cm; placebo, 35.0 cm; P = .007).
There were no significant differences between the groups in MAD or weight, either as assessed by ultrasound at 32 weeks gestation or as measured at birth.
Although the two groups did not differ in length at birth, the aggregate study population of infants born to women with PCOS was shorter than a large Swedish reference population.
When Dr. Hjorth-Hansen and her colleagues stratified the results by maternal body mass index (BMI), looking at babies born to women with BMIs below 25 kg/m2, compared with those with BMI of 25 kg/m2 and greater, they saw no differences in infant anthropometric measurements for women who had taken placebo.
However, when the investigators dichotomized maternal BMI for the metformin group, they found that infants born to the higher BMI group had a larger head size (P = .022), and were heavier (P = .002) and longer (P = .003) than infants born to women with BMIs less than 25.
“Metformin resulted in a larger head size, traceable already in utero,” said Dr. Hjorth-Hansen. However, she said, there’s a “PCOS effect” that results in the offspring of women with the condition having a shorter body, compared with offspring of women without PCOS.
Dr. Hjorth-Hansen reported no relevant disclosures.
FROM ENDO 2017
Key clinical point: In utero metformin exposure was associated with larger head size at birth in the offspring of women with PCOS.
Major finding: Infants born to women with PCOS who took metformin during pregnancy had larger head circumferences at birth than those born to women with PCOS who took placebo (35.6 cm versus 35.0 cm, P = .007).
Data source: Retrospective analysis of data from 258 women in the PregMet study.
Disclosures: Dr. Hjorth-Hansen reported no relevant disclosures.
Secukinumab beat etanercept in 52-week psoriasis quality of life analysis
Compared with etanercept, secukinumab was associated with significantly faster and greater improvements in skin-specific quality of life among adults with moderate to severe psoriasis, according to a pooled analysis of data from the randomized, multicenter phase III ERASURE and FIXTURE trials.
The findings confirm the clinical superiority of secukinumab and extend prior head-to-head comparisons of this IL-17A inhibitor and etanercept, said Bruce E. Strober, MD, PhD, of the University of Connecticut in Farmington, Conn.
ERASURE and FIXTURE compared secukinumab (weekly for 4 weeks, then once-monthly) with etanercept (twice weekly for 12 weeks, then once-weekly) and placebo. Both secukinumab doses outperformed etanercept and placebo based on coprimary endpoints of PASI 75 and the proportion of patients who were clear or almost clear on a 5-point modified investigator’s global assessment (N Engl J Med. 2014 Jul 24;371[4]:326-38). A subsequent analysis showed that the clinical superiority of secukinumab held up at 52 weeks (J Dermatolog Treat. 2016;27[1]:1).
Because health-related quality of life is of increasing interest in psoriasis, Dr. Strober and his associates analyzed ERASURE and FIXTURE data on the Dermatology Life Quality Index (DLQI), a 10-item assessment of daily activities, leisure, personal relationships, symptoms and feelings, treatment, and work or school. Their analysis included 572 patients who received 300 mg secukinumab, 572 patients who received 150 mg secukinumab, and 326 patients who received 50 mg etanercept.
Baseline DLQI scores were similar among treatment groups. By week 4, significantly more secukinumab than etanercept recipients reported DLQI scores of 0 or 1 (DLQI 0/1), indicating that psoriasis was not impairing their quality of life (P < .05). This difference remained significant through week 52 at the 300-mg dose. The 150-mg dose significantly outperformed etanercept by week 8, and the difference remained significant through week 48.
In all, 81% of patients who received 300 mg secukinumab reported DLQI 0/1 at week 24, compared with 71% of those who received 150 mg secukinumab and 55% of those who received etanercept (P < .001 for each comparison with etanercept).
Among patients with complete data available, 86% of 300-mg secukinumab recipients achieved DLQI 0/1 at week 24, maintained this response through 52 weeks, and achieved a 90%-100% reduction in PASI total score at week 24. That was true of only 75% of etanercept recipients (P = .02, compared with 300 mg secukinumab) and 79% of 150-mg secukinumab recipients (P = .4, compared with etanercept).
In an adjusted model, patients who received 300 mg secukinumab were 1.9 times more likely to reach DLQI 0/1 than those who received etanercept (95% confidence interval, 1.6-2.3). Median time to response to both doses of secukinumab was 12 weeks, versus 24 weeks for etanercept (P < .0001).
This analysis offers “a robust, longer-term head-to-head comparison” of how secukinumab and etanercept affect quality of life in psoriasis, the researchers wrote. “Future research should consider replicating this evaluation using real-world long-term data and including other aspects of patient-centered outcomes, such as work productivity and psoriasis-related symptoms.”
Novartis funded ERASURE and FIXTURE. Dr. Strober reported financial ties to AbbVie, Amgen, Novartis, and several other pharmaceutical companies.
Compared with etanercept, secukinumab was associated with significantly faster and greater improvements in skin-specific quality of life among adults with moderate to severe psoriasis, according to a pooled analysis of data from the randomized, multicenter phase III ERASURE and FIXTURE trials.
The findings confirm the clinical superiority of secukinumab and extend prior head-to-head comparisons of this IL-17A inhibitor and etanercept, said Bruce E. Strober, MD, PhD, of the University of Connecticut in Farmington, Conn.
ERASURE and FIXTURE compared secukinumab (weekly for 4 weeks, then once-monthly) with etanercept (twice weekly for 12 weeks, then once-weekly) and placebo. Both secukinumab doses outperformed etanercept and placebo based on coprimary endpoints of PASI 75 and the proportion of patients who were clear or almost clear on a 5-point modified investigator’s global assessment (N Engl J Med. 2014 Jul 24;371[4]:326-38). A subsequent analysis showed that the clinical superiority of secukinumab held up at 52 weeks (J Dermatolog Treat. 2016;27[1]:1).
Because health-related quality of life is of increasing interest in psoriasis, Dr. Strober and his associates analyzed ERASURE and FIXTURE data on the Dermatology Life Quality Index (DLQI), a 10-item assessment of daily activities, leisure, personal relationships, symptoms and feelings, treatment, and work or school. Their analysis included 572 patients who received 300 mg secukinumab, 572 patients who received 150 mg secukinumab, and 326 patients who received 50 mg etanercept.
Baseline DLQI scores were similar among treatment groups. By week 4, significantly more secukinumab than etanercept recipients reported DLQI scores of 0 or 1 (DLQI 0/1), indicating that psoriasis was not impairing their quality of life (P < .05). This difference remained significant through week 52 at the 300-mg dose. The 150-mg dose significantly outperformed etanercept by week 8, and the difference remained significant through week 48.
In all, 81% of patients who received 300 mg secukinumab reported DLQI 0/1 at week 24, compared with 71% of those who received 150 mg secukinumab and 55% of those who received etanercept (P < .001 for each comparison with etanercept).
Among patients with complete data available, 86% of 300-mg secukinumab recipients achieved DLQI 0/1 at week 24, maintained this response through 52 weeks, and achieved a 90%-100% reduction in PASI total score at week 24. That was true of only 75% of etanercept recipients (P = .02, compared with 300 mg secukinumab) and 79% of 150-mg secukinumab recipients (P = .4, compared with etanercept).
In an adjusted model, patients who received 300 mg secukinumab were 1.9 times more likely to reach DLQI 0/1 than those who received etanercept (95% confidence interval, 1.6-2.3). Median time to response to both doses of secukinumab was 12 weeks, versus 24 weeks for etanercept (P < .0001).
This analysis offers “a robust, longer-term head-to-head comparison” of how secukinumab and etanercept affect quality of life in psoriasis, the researchers wrote. “Future research should consider replicating this evaluation using real-world long-term data and including other aspects of patient-centered outcomes, such as work productivity and psoriasis-related symptoms.”
Novartis funded ERASURE and FIXTURE. Dr. Strober reported financial ties to AbbVie, Amgen, Novartis, and several other pharmaceutical companies.
FROM JOURNAL OF THE AMERICAN ACADEMY OF DERMATOLOGY
Key clinical point: Secukinumab was associated with faster and greater improvements in self-reported quality of life, compared with etanercept, in adults with moderate to severe plaque psoriasis.
Major finding: In all, 81% of patients who received 300 mg secukinumab reported DLQI 0/1 at week 24, compared with 71% of those who received 150 mg secukinumab and 55% of those who received etanercept (P < .001 for each comparison with etanercept).
Data source: A pooled analysis of the randomized, multicenter, phase III ERASURE and FIXTURE trials.
Disclosures: Novartis funded ERASURE and FIXTURE. Dr. Strober reported financial ties to AbbVie, Amgen, Novartis, and several other pharmaceutical companies.
Parental smoking linked to genetic changes in kids with ALL
Smoking by either parent helps promote genetic deletions in children that are associated with the development and progression of acute lymphoblastic leukemia (ALL), according to a study published in Cancer Research.
The strongest associations in this study were found in children whose parents smoked while the children were in utero and during their infancy.
However, the genetic deletions were also noted in the offspring of parents who may have quit smoking even before conception.
The link between ALL and parental smoking has already been established, but this is the first study that points to specific genetic changes in children with ALL, according to study author Adam de Smith, PhD, of the University of California San Francisco.
“With more smoking among the parents, we saw more deletions within the child’s ALL cells at diagnosis,” Dr. de Smith said.
For this study, he and his colleagues looked at pre-treatment tumor samples from 559 ALL patients.
The team wanted to see if any of 8 genomic regions frequently deleted in ALL patients (the genes CDKN2A, ETV6, IKZF1, PAX5, RB1, BTG1, and EBF1, plus the PAR1 region) were missing in the samples.
Parents completed questionnaires about their smoking habits so the researchers could determine whether those habits affected the number of genetic deletions. The questionnaire data were corroborated by a biomarker in newborns’ blood samples that indicates exposure to maternal smoking during pregnancy.
The researchers found that approximately two-thirds of the tumor samples (n=353) contained at least 1 deletion.
Deletions were considerably more common in children whose mothers had smoked during pregnancy and after birth.
For each 5 cigarettes smoked daily during pregnancy, there was a 22% increase in the number of deletions. For each 5 cigarettes smoked daily during breastfeeding, there was a 74% increase in the number of deletions.
Smoking of 5 cigarettes daily by the mother or father before conception was associated with a 7% to 8% higher number of deletions.
Role of child age and sex
One discovery the researchers found intriguing was the link between the fathers’ pre-conception smoking and their child’s age at diagnosis.
“There was a significant effect on deletion numbers in cases where the patient was age 6 or younger,” Dr de Smith said. “Our results suggest that paternal pre-conception smoking, which is known to cause oxidative damage to sperm DNA, may lead to a higher propensity of deletions in children with earlier-onset ALL.”
“It is also true that some of those fathers who smoked before conception also continue to smoke in the presence of the mother and child, so more research is needed to explain the mechanism of smoking-related damage in all of the time periods of exposure to the child.”
In addition, male children were found to be more sensitive to the effects of maternal smoking, including smoking that occurred pre-conception.
The researchers said this could be explained by the fact that male fetuses grow more rapidly, which leads to increased vulnerability of developing lymphocytes to toxins that cause genetic damage.
“Our study indicates that the more tobacco exposure, the more cumulative DNA damage is evident in the ALL cell,” said study author Joseph Wiemels, PhD, of the University of California San Francisco.
“While causes of ALL are multifactorial—including the inborn genetic makeup of the child, patterns of infection, pesticides, and other environmental exposure—if there was no smoking in the environment, then there would likely be fewer children with the disease. We may add ALL to the long list of diseases impacted by smoking, and, in this case, affecting one of our most vulnerable populations—our children.”
NET can benefit breast cancer patients with delayed surgery
SEATTLE – A short course of neoadjuvant therapy could be considered for breast cancer patients awaiting surgery when delays to resection are expected, according to study findings presented at the annual Society of Surgical Oncology Cancer Symposium.
More than half of breast cancer patients who undergo surgical resection as the initial treatment modality experience delays to surgery of more than 4 weeks. Of this group, more than half receive shorter-than-standard courses of neoadjuvant therapy (NET). The patients most likely to benefit were those older than 50 years with ductal tumors, and the effect was seen across all T stages.
Recent reports show that NET use is increasing. However, study author Dr. De Andrade pointed out, delays in receiving surgery remain a problem in breast cancer treatment and are associated with worse overall and cancer-specific survival.
“Off-label use of NET is sometimes used in patients undergoing surgical delays,” he said.
NET given for 3 months has been associated with shrinking tumors in patients with hormone receptor–positive (HR+) invasive breast cancer, allowing for breast conservation therapy. While short-term NET is sometimes used in women facing delays to surgery, the incidence and efficacy of this approach remain undefined.
In the current study, Dr. De Andrade and his colleagues sought to answer three clinical questions:
• How long are patients with operable breast cancer waiting to undergo surgery?
• What is the pattern of use of short-course NET?
• What are the effects of short-course NET on outcomes?
The investigators used the National Cancer Database (NCDB) to identify women who had undergone surgery for stage 1-3 HR+ invasive breast cancer from 2004 to 2013. A total of 530,009 patients met inclusion criteria.
The primary outcomes of the study were time to surgery, the duration of NET, and whether the pathologic stage at surgery was lower than the clinical stage.
Among patients who did not receive NET, 49.3% underwent surgery within 30 days of diagnosis. More than a third (37.2%) underwent surgery between 30 and 60 days after diagnosis, and 13.5% did not have surgery until more than 60 days after their initial diagnosis. A total of 9,664 patients (1.8%) underwent NET.
When looking at NET duration, 48% underwent NET for 12 or more weeks, while 52% received NET for less than 12 weeks; 27% received NET for less than 4 weeks, 17% for 4-8 weeks, and 9% for 8-12 weeks.
Downstaging from clinical stage to final pathology stage increased with longer duration of NET. It was 5.5% for less than 1 month on therapy, 9.7% for 1-2 months, and 17.2% for 2-3 months.
“For less than 4 weeks, there was no improvement in N or T downstaging,” said Dr. De Andrade. “As the amount of time on NET increased, it was associated with greater T downstaging. But for N downstaging, it was only at the standard of 12 or more weeks that a difference was seen in nodal downstaging.”
Standard NET of 12 or more weeks was associated with reduced mastectomy rates, but mastectomy rates were not lower in short-course NET.
Among patients undergoing breast conservation therapy, longer duration NET was also associated with a lower risk for re-excision (1-2 months: odds ratio, 0.82, P = .02; 2-3 months: OR, 0.40, P < .001). There was no reduction in re-excision for shorter courses of therapy.
Dr. De Andrade had no disclosures.
AT SSO 2017
Key clinical point: Short-course neoadjuvant therapy is an option for breast cancer patients with expected delays to surgery.
Major finding: Use of neoadjuvant therapy was associated with downstaging from clinical stage to final pathology stage and reducing re-excision in breast conservation surgery.
Data source: The National Cancer Database was used to identify 530,009 patients.
Disclosures: Dr. De Andrade had no disclosures.
Routine U.S. mitral clip use found reassuring
WASHINGTON – U.S. heart teams have used the mitral valve transcatheter clip repair device for fixing leaky mitral valves exactly the way it was designed to be used once the device hit the U.S. market in 2013.
In the first review of periprocedural and 1-year outcomes of U.S. patients treated with the MitraClip repair device and entered in the national device registry, the results showed “acute effectiveness and safety of transcatheter mitral valve repair,” Paul Sorajja, MD, said at the annual meeting of the American College of Cardiology.
“We need to be keenly aware of the impact of comorbidities on the prognosis of these patients. The data show that untreated comorbidities really impact prognosis,” said Dr. Sorajja, an interventional cardiologist and director of the Center of Valve and Structural Heart Disease of the Minneapolis Heart Institute.
“The clip is for the no-option patient, meaning patients at high risk who have no surgical option. The data show that these are the patients who are being treated” in routine U.S. practice. “The data show that, even for these patients, you can still get pretty good results,” Dr. Sorajja said in an interview. “These are the first data on clip use in routine U.S. practice, and they are really reassuring. The data show that the clip is being used in the correct way, without risk creep, on patients with prohibitive surgical risk based on their STS [Society of Thoracic Surgeons] predicted mortality and frailty scores.”
The data he and his associates reviewed came from the 2,952 U.S. patients who underwent transcatheter mitral valve clip repair, following the device’s premarket approval by the Food and Drug Administration in November 2013, through September 2015 at any of the 250 U.S. sites offering the procedure.
The data on patient demographics and clinical status came from the STS/American College of Cardiology Transcatheter Valve Therapy registry, and data on 1-year outcomes came from Medicare records for 1,867 (63%) of the patients.
The mitral valve repair patients averaged 82 years of age; 85% had New York Heart Association functional class III or IV disease, 93% had mitral valve regurgitation of grade 3 or 4, and half were judged frail. Their STS-predicted mortality risk was about 6% for mitral valve repair and about 9% for valve replacement.
Immediately after their procedure, 93% of patients had a valve regurgitation grade of 2 or less, the periprocedural mortality rate was just under 3%, and 86% of patients were discharged home following a median length of stay of 2 days. Acute procedural success occurred in 92% of patients, Dr. Sorajja reported.
At 1 year, among the patients followed through their Medicare records, 26% had died, 20% had been hospitalized at least once for heart failure, and 38% had experienced at least one of these two outcomes. In addition, 6% underwent repeat transcatheter mitral repair, and 2% had mitral valve replacement surgery.
Although patients who had a successful repair with a residual regurgitation grade of 0 or 1 still had a substantial mortality rate of 22% during 1-year follow-up, survival was worse in patients with higher grades of residual mitral regurgitation. One-year mortality among those with residual grade 2 regurgitation was 29%, and for those with residual grade 3 or 4 regurgitation, 1-year mortality was 49%.
Many patients also had at least one comorbidity, and when these were present, 1-year survival was significantly worse. In a multivariate model, patients on dialysis had twofold greater mortality than did those not on dialysis, patients with severe tricuspid valve regurgitation had twice the mortality of those with lesser or no tricuspid regurgitation, and patients with moderate or severe lung disease had a 50% higher mortality, compared with those with milder or no lung disease.
The study was supported in part by Abbott Vascular, the company that markets the MitraClip. Dr. Sorajja has been a consultant to and speaker on behalf of Abbott Vascular. He has also been a consultant to Integer, Lake Region Medical, and Medtronic, and a speaker on behalf of Boston Scientific.
[email protected]
On Twitter @mitchelzoler
AT ACC 17
Key clinical point: In routine U.S. practice, transcatheter mitral valve clip repair is being used as intended, in patients at prohibitive surgical risk, with reassuring acute outcomes.
Major finding: U.S. mitral clip patients averaged 82 years of age, their acute success rate was 92%, and 1-year mortality was 26%.
Data source: A review of 2,952 U.S. patients who underwent transcatheter mitral clip repair and entered into the STS/ACC/TVT registry through September 2015.
Disclosures: The study was supported in part by Abbott Vascular, the company that markets the MitraClip. Dr. Sorajja has been a consultant to and speaker on behalf of Abbott Vascular. He has also been a consultant to Integer, Lake Region Medical, and Medtronic, and a speaker on behalf of Boston Scientific.
Three factors linked to rhinovirus pneumonia in HCT patients
ORLANDO – For patients who have received hematopoietic cell transplants, a rhinovirus infection can become much more than a cold.
“It holds true that rhinovirus is just as likely to be associated with mortality as are other respiratory viruses” among HCT recipients, Alpana Waghmare, MD, said at the combined annual meetings of the Center for International Blood & Marrow Transplant Research and the American Society for Blood and Marrow Transplantation.
In a new retrospective study, Dr. Waghmare and her coinvestigators found that the median time for a rhinovirus infection to progress from an upper to a lower respiratory tract infection was about 2 weeks among post-HCT patients.
Clinical and demographic risk factors for progression to lower respiratory tract infection included higher levels of steroid use (2 mg/kg per day or more) before developing the upper respiratory infection, a low white blood cell count, and a low monocyte count, said Dr. Waghmare, an infectious disease specialist and professor of pediatrics at the University of Washington, Seattle.
Of 3,445 HCT patients treated at the university center during the 6-year study, 732 patients (21%) were positive for human rhinovirus. Patients were classified as having upper respiratory infections if they had a PCR-positive nasal swab.
Patients were classified into one of three categories for potential lower respiratory tract infection: proven infections had virus detected by bronchoalveolar lavage or biopsy in patients with a new radiographic abnormality; probable infections had virus detected on bronchoalveolar lavage or biopsy but no radiographic changes; and possible infections had virus detected only on upper-tract nasal swabs along with a new radiographic abnormality.
Among the patients positive for human rhinovirus, 85% (665 patients) presented with upper respiratory infections and 15% (117 patients) with lower respiratory tract infections. By day 90, 16% of patients progressed from upper to lower respiratory tract infections. The median time to progression was 13.5 days. Progression to proven lower respiratory tract infection affected 5% of the HCT recipients.
In multivariable analytic models, a minimum white blood cell count of 1,000 or less was associated with a hazard ratio (HR) of 2.21 for progression to lower respiratory tract infection, and a minimum monocyte count of 1,000 or less was associated with an HR of 3.66.
The model also found an HR of 3.37 for lower respiratory tract infection with steroid use of 2 mg/kg per day or more. The patient’s conditioning regimen and donor type were not significantly associated with risk of progression to lower respiratory infection.
Viral copathogens, prior respiratory virus episodes, and time since HCT were not associated with risk of progression to lower respiratory infection; neither were patient age, baseline lung function, or the year the transplant occurred.
“These data provide an initial framework for patient risk stratification and the development of rational prevention and treatment strategies in HCT recipients,” she said.
Dr. Waghmare reported receiving research funding from Aviragen, the maker of vapendavir, an investigational drug for human rhinovirus infection, and Gilead Sciences.
[email protected]
On Twitter @karioakes
AT THE BMT TANDEM MEETINGS
Key clinical point: Higher steroid doses, a low white blood cell count, and a low monocyte count were linked to progression from upper to lower respiratory tract rhinovirus infection in HCT recipients.
Major finding: Of 3,445 HCT patients, 732 patients (21%) were positive for human rhinovirus.
Data source: Single-center, 6-year retrospective study of 732 HCT patients with human rhinovirus infection.
Disclosures: Dr. Waghmare reported receiving research funding from Aviragen, the maker of vapendavir, an investigational drug for human rhinovirus infection, and Gilead Sciences.