Combination Therapy with Ribociclib Improves Progression-Free Survival in Advanced Breast Cancer


Study Overview

Objective. To evaluate the efficacy and safety of the CDK4/6 inhibitor ribociclib in combination with letrozole as initial therapy in patients with hormone-receptor (HR)–positive, human epidermal growth factor receptor 2 (HER-2)–negative advanced breast cancer.

Design. Pre-planned interim analysis of a randomized, double-blind, phase 3 clinical trial.

Setting and participants. This study enrolled patients at 223 centers in 29 countries. A total of 668 postmenopausal women underwent randomization, with 334 assigned to receive ribociclib plus letrozole and 334 assigned to receive placebo plus letrozole. All women had HR-positive, HER2-negative recurrent or metastatic breast cancer and had not received prior systemic therapy for advanced disease. Enrolled patients had either measurable disease on imaging or at least 1 lytic bone lesion, and all were required to have an Eastern Cooperative Oncology Group performance status of 0 or 1. Patients were excluded if they had received prior therapy with a CDK4/6 inhibitor or previous systemic chemotherapy or endocrine therapy for advanced disease. If a patient had received an aromatase inhibitor as neoadjuvant or adjuvant therapy, the disease-free interval needed to be more than 12 months for inclusion in the study. Patients with inflammatory breast cancer or central nervous system involvement were also excluded. Normal cardiac function (normal QT interval) was required for enrollment. Randomization was stratified by the presence of liver or lung metastases.

Intervention. The patients were randomized to oral ribociclib 600 mg per day (3 weeks on, 1 week off in a 28-day treatment cycle) plus letrozole 2.5 mg daily, or to placebo plus letrozole. The dosing of ribociclib was based on a prior phase 1 study [1]. Treatment was continued until disease progression, unacceptable toxicity, discontinuation, or death. Dose reductions of ribociclib were allowed; however, dose reductions of letrozole were not permitted. Crossover between treatment arms was not allowed. Patients were assessed with computed tomography at the time of randomization, every 8 weeks for the first 18 months, and every 12 weeks thereafter. Patients were monitored for hematological toxicity each cycle. Electrocardiographic assessment was done at screening, on day 15 of cycle 1, and on day 1 of all subsequent cycles to monitor for QT prolongation.

Main outcome measures. The primary outcome was progression-free survival. The secondary outcomes were overall survival, overall response rate (complete or partial response), clinical benefit rate, and safety. Clinical benefit rate was defined as overall response plus stable disease lasting 24 weeks or more. A prespecified interim analysis was planned after 211 of the 302 events (disease progression or death) required for the final analysis had occurred (70%).
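To make these endpoint definitions concrete, the sketch below classifies a handful of invented patient records (they are hypothetical, not trial data): the overall response rate counts complete and partial responses, while the clinical benefit rate additionally counts stable disease maintained for at least 24 weeks.

```python
# Hypothetical records for illustration only; not MONALEESA-2 data.
patients = [
    {"best_response": "CR", "sd_weeks": 0},
    {"best_response": "PR", "sd_weeks": 0},
    {"best_response": "SD", "sd_weeks": 30},  # stable >= 24 weeks: clinical benefit
    {"best_response": "SD", "sd_weeks": 12},  # stable < 24 weeks: no benefit
    {"best_response": "PD", "sd_weeks": 0},   # progressive disease
]

def overall_response(p):
    # Overall response rate counts complete or partial responses only.
    return p["best_response"] in ("CR", "PR")

def clinical_benefit(p):
    # Clinical benefit adds stable disease lasting 24 weeks or more.
    return overall_response(p) or (p["best_response"] == "SD" and p["sd_weeks"] >= 24)

orr = sum(map(overall_response, patients)) / len(patients)
cbr = sum(map(clinical_benefit, patients)) / len(patients)
print(orr, cbr)  # 0.4 0.6
```

This also makes plain why the clinical benefit rate always equals or exceeds the overall response rate, as it does in the results below.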

Results. The baseline characteristics were balanced between the 2 groups. Visceral disease was present in 58.8% of patients and bone-only disease in 22%. The median duration of therapy exposure was 13 months in the ribociclib group and 12.4 months in the placebo group, and the median duration of follow-up was 15.3 months. At 18 months, progression-free survival was 63% (95% confidence interval [CI], 54.6 to 70.3) in the ribociclib/letrozole group versus 42.2% (95% CI, 34.8 to 49.5) in the placebo group (P < 0.001). The median progression-free survival was not reached in the combination group (95% CI, 19.3 to not reached) versus 14.7 months (95% CI, 13.0 to 16.5) in the placebo group. The improvement in progression-free survival was seen across all subgroups. The overall response rate was higher in the combination arm (52.7% vs. 37.1%), as was the clinical benefit rate (80.1% vs. 71.8%). Serious adverse events occurred in 21.3% of patients in the ribociclib group and 11.8% in the placebo group and were attributed to the study drug in 7.5% and 1.5%, respectively. The most common adverse events were myelosuppression, nausea, fatigue, and diarrhea. Grade 3 or 4 neutropenia was noted in 59.3% of the ribociclib group versus less than 1% of the placebo arm. The rate of discontinuation due to adverse events was 7.5% in the ribociclib group versus 2.1% in the placebo group. The most common reason for discontinuation was disease progression, in 26% of the ribociclib group and 43.7% of the placebo group. Three deaths occurred in the ribociclib group and one in the placebo group. Ribociclib dosing was interrupted in 76.9% of patients, and dose reductions occurred in 53.9% of the ribociclib group versus 7% of the placebo group; the most common reason for dose reduction was neutropenia.
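The arm-level figures are internally consistent, which a back-of-the-envelope survival model can show. The sketch below assumes a constant hazard (exponential survival) and uses a hazard ratio of roughly 0.56 from the primary MONALEESA-2 publication; neither assumption is part of the trial's own analysis, but together with the placebo arm's 14.7-month median they approximately reproduce the 18-month rates quoted above.

```python
import math

def pfs_at(months, median_months, hazard_ratio=1.0):
    """Survival probability at `months` under an exponential model with the
    given median, scaled by a proportional hazard ratio (illustrative only)."""
    base_hazard = math.log(2) / median_months
    return math.exp(-hazard_ratio * base_hazard * months)

placebo_18mo = pfs_at(18, median_months=14.7)                        # ~0.43; reported 42.2%
ribociclib_18mo = pfs_at(18, median_months=14.7, hazard_ratio=0.56)  # ~0.62; reported 63%

print(f"placebo arm:    {placebo_18mo:.1%}")
print(f"ribociclib arm: {ribociclib_18mo:.1%}")
```

The close agreement is a sanity check on the reported numbers, not evidence that progression events actually follow an exponential distribution.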

Conclusion. First-line treatment with ribociclib plus letrozole in postmenopausal women with HR-positive, HER2-negative advanced breast cancer was associated with significantly longer progression-free survival compared with letrozole plus placebo. The improvement in progression-free survival was seen across all subgroups.

Commentary

Nearly 80% of all breast cancers are hormone receptor-positive. Hormonal therapy has been an important component of treatment for women with hormone receptor-positive breast cancer in both the localized and metastatic settings. However, many tumors eventually develop resistance to such therapy; the median progression-free survival with first-line endocrine therapy alone is around 9 months [2]. Cyclin-dependent kinases 4 and 6 (CDK4/6) play an important role in estrogen-receptor signaling and cell cycle progression: CDK4/6 mediate progression through the cell cycle from G1 to S phase via phosphorylation and inactivation of the retinoblastoma tumor suppressor protein [3]. Overexpression of CDK4/6 in hormone receptor-positive breast cancer is thought to play an important role in the development of endocrine therapy resistance [4].

The previously published PALOMA-2 trial, which compared the CDK4/6 inhibitor palbociclib plus letrozole with letrozole alone, reported a significant improvement in progression-free survival with the addition of palbociclib (24.8 months vs. 14.5 months) in the front-line setting for women with advanced, hormone receptor-positive breast cancer [5]. The improved progression-free survival with palbociclib was seen across all subgroups, with a favorable toxicity profile. The current study represents the second randomized trial to show that adding a CDK4/6 inhibitor to endocrine-based therapy significantly improves progression-free survival. This benefit was also seen across all patient subgroups, including those with liver and lung metastases. In addition, the combination of ribociclib and letrozole showed a significantly higher overall response rate compared with placebo. In general, the addition of ribociclib to letrozole was well tolerated, with a very low rate (7.5%) of discontinuation of therapy. Although neutropenia was a frequent complication in the ribociclib group, febrile neutropenia occurred in only 1.5% of patients.

The incorporation of CDK4/6 inhibitors into endocrine-based therapy in the front-line setting has proven effective, with an impressive early separation of the progression-free survival curves. Both the PALOMA-2 trial and the current MONALEESA-2 trial have shown similar results, with an approximately 40% reduction in the risk of progression or death. Whether the results seen in these trials will translate into an improvement in overall survival is yet to be determined. The results of these 2 trials suggest that CDK4/6 inhibitors have activity both in patients who have not received previous endocrine therapy and in those who received adjuvant endocrine therapy with late (> 12 months) relapse. Identifying the subset of women most likely to benefit from the addition of CDK4/6 inhibitors remains an important clinical question; there are currently no clinical biomarkers that can predict whether a patient will benefit from these medications.

Applications for Clinical Practice

The results of the current trial represent an exciting step forward in the treatment of advanced breast cancer. Palbociclib in combination with endocrine therapy is already incorporated into clinical practice. The cost of these agents remains a concern; however, most insurance policies will cover them. Clinical trials are ongoing in the neoadjuvant and adjuvant settings for early breast cancer.

—Daniel Isaac, DO, MS

References

1. Infante JR, Cassier PA, Gerecitano JF, et al. A phase 1 study of cyclin-dependent kinase 4/6 inhibitor ribociclib (LEE011) in patients with advanced solid tumors and lymphomas. Clin Cancer Res 2016.

2. Mouridsen H, Gershanovich M, Sun Y, et al. Phase III study of letrozole versus tamoxifen as first-line therapy of advanced breast cancer in postmenopausal women: analysis of survival and update of efficacy from the International Letrozole Breast Cancer Group. J Clin Oncol 2003;21:2101–9.

3. Weinberg RA. The retinoblastoma protein and cell cycle control. Cell 1995;81:323–30.

4. Zardavas D, Baselga J, Piccart M. Emerging targeted agents in metastatic breast cancer. Nat Rev Clin Oncol 2013;10:191–210.

5. Finn RS, Martin M, Rugo HS, et al. PALOMA-2: primary results from a phase III trial of palbociclib with letrozole compared with letrozole alone in women with ER+/HER2– advanced breast cancer. J Clin Oncol 2016;34(Suppl). Abstract 507.

Issue
Journal of Clinical Outcomes Management - December 2016, Vol. 23, No. 12

VIDEO: 33A + ‘7 + 3’ equals good remission numbers in untreated AML


Call it “7+3+1”: an experimental induction regimen combining standard chemotherapy with an antibody-drug conjugate induced rapid and deep remissions in a majority of patients with newly diagnosed acute myeloid leukemia in a small study.

Among 42 evaluable patients with previously untreated AML, the combination of cytarabine and an anthracycline (7+3, also known as 3+7) with the investigational antibody-drug conjugate vadastuximab talirine was associated with a 60% complete remission (CR) rate and a 17% rate of complete remission with incomplete recovery of platelets (CRi), reported Harry P. Erba, MD, PhD, of the University of Alabama at Birmingham, who discussed the findings in a video interview.


“In 1973, 43 years ago, the first paper was published on what we still continue to use as the initial therapy for a very aggressive cancer, acute myeloid leukemia,” he said at a briefing at the American Society of Hematology annual meeting.

“Nothing has been shown yet to be superior to that, despite four decades of clinical research,” he added.

Recent studies have suggested that depth of postinduction remissions, specifically being minimal residual disease (MRD)-negative, is associated with improved survival, he noted.

Vadastuximab talirine (33A, for short) is an antibody-drug conjugate targeted to CD33, which is expressed in approximately 90% of AML cells. The drug is designed to deliver a cytotoxic agent to myeloid leukemia cells.

As reported previously, 33A, in combination with a hypomethylating agent (decitabine or azacitidine) in 49 evaluable patients, was associated with a composite CR/CRi rate of 71%; the rates of CR/CRi were similar regardless of the partner agent used.

The overall response rate in that study was 76%, with responses seen among higher-risk patients, including remissions in 16 of 22 patients with underlying myelodysplasia, and in 15 of 18 patients with adverse cytogenetics.

Rapid complete remissions

In the phase Ib trial reported at ASH 2016 by Dr. Erba, adults aged 18-65 years with untreated primary or secondary AML (except acute promyelocytic leukemia) were enrolled.

The patients received 33A in combination with 7+3 induction therapy (cytarabine 100 mg/m2 and daunorubicin 60 mg/m2) on days 1 and 4 of a 28-day treatment cycle. Patients were assessed for response on days 1 and 28 according to International Working Group Criteria.

Second induction regimens and postremission therapies were permitted at the investigators’ discretion and did not include 33A.

The median patient age was 45.5 years, and patients generally had good performance status (Eastern Cooperative Oncology Group 0 or 1). In all, 17% of patients had secondary AML; 12% had favorable cytogenetic risk disease, 50% intermediate risk, and 36% adverse risk. Ten percent of patients had NPM1-mutated disease, and 14% had FLT3 mutations.

As noted, the composite CR/CRi rate was 76%, consisting of 60% CR and 17% CRI.

All five patients with favorable risk disease had a CR. The rate of CR/CRi was 86% among patients with intermediate-risk disease, and 60 for those with adverse-risk disease.

Of the 32 patients who achieved a CR or CRi, 94% did so after 1 cycle of therapy, and 25 were MRD negative, as evaluated by an independent laboratory using 10-color multi-parameter flow cytometry.

Treatment-related adverse hematologic events included febrile neutropenia (primarily grade 3) in 43% of patients, thrombocytopenia (mostly grade 4) in 38%, anemia (all grade 3) in 24%, and neutropenia (mostly grade 4) in 17%. Other treatment related events were similar to those seen with 7 + 3 alone, and included nausea, diarrhea, decreased appetite and fatigue, mostly grade 1 or 2. One patient had a grade 3 irreversible hepatic toxicity.

The death rate was 2%.

“What we felt we showed is that we were able to combine active doses of 33A with 7 + 3. The doses here were less than the doses used as a single agent, but all doses used in our phase 1b study, including lower doses that what we actually used here, showed complete remissions as a single agent.”

33A “added acceptable on-target myelosuppression. We saw platelet counts recovering to over 100,000, and neutrophils over 1,000 by about four-and-a-half to five weeks, which we felt was reasonable, and patients were able to go on to get post-remission therapy.

A randomized phase II trial comparing 33A and 7+3 to 7+3 alone is slated to launch in the first quarter of 2017.

Dr. Erba disclosed serving as a consultant to and receiving research funding from Seattle Genetics, which supported the study.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

 

– Call it “7+3+1”: an experimental induction regimen combining standard chemotherapy with an antibody-drug conjugate induced rapid and deep remissions in a majority of patients with newly diagnosed acute myeloid leukemia in a small study.

Among 42 evaluable patients with previously untreated AML, the combination of cytarabine and an anthracycline (7+3, also known as 3+7) and the investigational antibody-drug conjugate vadastuximab talirine was associated with a 60% complete remission (CR) rate and a 17% rate of complete remission with incomplete recovery of platelets (CRi), reported Harry P. Erba, MD, PhD, of the University of Alabama at Birmingham, who discussed the findings in a video interview.


“In 1973, 43 years ago, the first paper was published on what we still continue to use as the initial therapy for a very aggressive cancer, acute myeloid leukemia,” he said at a briefing at the American Society of Hematology annual meeting.

“Nothing has been shown yet to be superior to that, despite four decades of clinical research,” he added.

Recent studies have suggested that depth of postinduction remissions, specifically being minimal residual disease (MRD)-negative, is associated with improved survival, he noted.

Vadastuximab talirine (33A, for short) is an antibody-drug conjugate targeted to CD33, which is expressed in approximately 90% of AML cells. The drug is designed to deliver a cytotoxic agent to myeloid leukemia cells.

As reported previously, 33A, in combination with a hypomethylating agent (decitabine or azacitidine) in 49 evaluable patients, was associated with a composite CR/CRi rate of 71%; the rates of CR/CRi were similar regardless of the partner agent used.

The overall response rate in that study was 76%, with responses seen among higher-risk patients, including remissions in 16 of 22 patients with underlying myelodysplasia, and in 15 of 18 patients with adverse cytogenetics.

Rapid complete remissions

In the phase Ib trial reported at ASH 2016 by Dr. Erba, adults aged 18-65 years with untreated primary or secondary AML (except acute promyelocytic leukemia) were enrolled.

The patients received 33A on days 1 and 4 of a 28-day treatment cycle, in combination with 7+3 induction therapy (cytarabine 100 mg/m2 and daunorubicin 60 mg/m2). Patients were assessed for response on days 1 and 28 according to International Working Group criteria.

Second induction regimens and postremission therapies were permitted at the investigators' discretion and did not include 33A.

The median patient age was 45.5 years, and patients had generally good performance status (Eastern Cooperative Oncology Group 0 or 1). In all, 17% of patients had secondary AML; 12% had favorable cytogenetic risk disease, 50% had intermediate risk, and 36% had adverse risk. Ten percent of patients had NPM1-mutated disease, and 14% had FLT3 mutations.

As noted, the composite CR/CRi rate was 76%, consisting of 60% CR and 17% CRi.

All five patients with favorable-risk disease had a CR. The rate of CR/CRi was 86% among patients with intermediate-risk disease, and 60% for those with adverse-risk disease.

Of the 32 patients who achieved a CR or CRi, 94% did so after 1 cycle of therapy, and 25 were MRD negative, as evaluated by an independent laboratory using 10-color multi-parameter flow cytometry.
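The reported rates can look arithmetically off at first glance (60% + 17% = 77%, versus a 76% composite), but the discrepancy is rounding. A quick check, assuming 25 CRs and 7 CRis among the 42 evaluable patients (counts inferred from the percentages, not stated in the report):

```python
# Rounding reconciliation: 25 CR + 7 CRi of 42 evaluable (inferred counts).
cr, cri, n = 25, 7, 42
print(round(100 * cr / n))          # 60  (59.5% rounds up)
print(round(100 * cri / n))         # 17  (16.7% rounds up)
print(round(100 * (cr + cri) / n))  # 76  (76.2% rounds down)
```

The individual rates each round up while the composite rounds down, which is why the parts appear to exceed the whole by one percentage point.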

Treatment-related adverse hematologic events included febrile neutropenia (primarily grade 3) in 43% of patients, thrombocytopenia (mostly grade 4) in 38%, anemia (all grade 3) in 24%, and neutropenia (mostly grade 4) in 17%. Other treatment-related events were similar to those seen with 7+3 alone and included nausea, diarrhea, decreased appetite, and fatigue, mostly grade 1 or 2. One patient had a grade 3 irreversible hepatic toxicity.

The death rate was 2%.

“What we felt we showed is that we were able to combine active doses of 33A with 7+3. The doses here were less than the doses used as a single agent, but all doses used in our phase 1b study, including lower doses than what we actually used here, showed complete remissions as a single agent.”

33A “added acceptable on-target myelosuppression. We saw platelet counts recovering to over 100,000, and neutrophils over 1,000 by about four-and-a-half to five weeks, which we felt was reasonable, and patients were able to go on to get post-remission therapy.”

A randomized phase II trial comparing 33A and 7+3 to 7+3 alone is slated to launch in the first quarter of 2017.

Dr. Erba disclosed serving as a consultant to and receiving research funding from Seattle Genetics, which supported the study.

 


Article Source

AT ASH 2016

Vitals

Key clinical point: Deep remissions following induction therapy for AML are associated with better survival outcomes.

Major finding: Adding the antibody-drug conjugate vadastuximab talirine (33A) to 7+3 induction therapy induced complete or near-complete remissions in 76% of patients with newly diagnosed acute myeloid leukemia.

Data source: Phase Ib study in 42 patients with previously untreated primary or secondary AML.

Disclosures: Dr. Erba disclosed serving as a consultant to and receiving research funding from Seattle Genetics, which supported the study.

Does Higher BMI Directly Increase Risk of Cardiovascular Disease? Maybe Not . . .

Article Type
Changed
Thu, 03/01/2018 - 11:23

Study Overview

Objective. To evaluate whether higher BMI alone contributes to risk of cardiovascular disease (CVD) and death.

Study design. Cohort study of weight-discordant monozygotic twin pairs

Setting and participants. This study took place in Sweden, using a subset of data from the Swedish Twin Registry and the Screening Across Lifespan Twin (SALT) study, which aimed to screen Swedish twins born prior to 1958 for the development of “common complex diseases.” From a total of 44,820 individuals, the current study was limited to a subset of 4046 monozygotic twin pairs where both twins had self-reported height and weight data, and where calculated body mass index (BMI) was discordant between the twins, defined as a difference > 0.01 kg/m2. No other inclusion or exclusion criteria are mentioned. Data for the study were collected from several different sources, including telephone interviews (eg, height and weight, behaviors such as physical activity and smoking), national registries on health conditions (eg, myocardial infarction [MI], stroke, diabetes) or prescriptions (eg, diabetes medications), the national causes of death register, and a nationwide database containing socioeconomic variables (eg, income and education). The primary exposure of interest for this study was weight status, categorized as “leaner” or “heavier,” depending on the relative BMI of each twin in a given pair. “Leaner” twins were assumed to have lower adiposity than their “heavier” counterparts, and yet to have identical genetic makeup, thereby allowing the authors to eliminate the contribution of genetic confounding in evaluating the relationship between weight status and CVD risk. The classification system could mean that one person with a BMI of 26 kg/m2 would be placed in the “leaner” category if their twin had a BMI of 28, while someone else in another twin pair but also with a BMI of 26 kg/m2 might be classified in the “heavier” category if their twin had a BMI of 22. Twin pairs were followed for up to 15 years to assess for incident outcomes of interest, with baseline data collected between 1998 and 2002, and follow-up through 2013.
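The within-pair classification rule described above can be sketched in a few lines. This is illustrative only, not the investigators' code; the 0.01-unit discordance threshold is taken from the text, and the example BMIs are the hypothetical ones from the paragraph:

```python
def bmi(weight_kg, height_m):
    """Body mass index = weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def classify_pair(bmi_a, bmi_b, min_discordance=0.01):
    """Label twins a and b as 'leaner'/'heavier' by relative BMI.
    Returns None for concordant pairs (excluded from the study)."""
    if abs(bmi_a - bmi_b) <= min_discordance:
        return None
    return ("leaner", "heavier") if bmi_a < bmi_b else ("heavier", "leaner")

# The same BMI of 26 is "leaner" next to a twin at 28, but "heavier" next to 22:
print(classify_pair(26.0, 28.0))  # ('leaner', 'heavier')
print(classify_pair(26.0, 22.0))  # ('heavier', 'leaner')
```

The labels are purely relative within a pair, which is the source of the clinical-relevance concern discussed later in the commentary.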

Main outcome measures. The primary outcome of interest was the occurrence of incident MI or death from any cause. As above, these outcomes were assessed using national disease and death registries spanning 1987-2013, and ICD-9 or -10 codes of interest. A secondary outcome of incident diabetes was also specified, presumably limited to development of type 2 diabetes mellitus, and identified using the same datasets, as well as the national prescription registry. Kaplan-Meier curves for incident MI and death were constructed comparing all “leaner” twins against all “heavier” twins, and Cox proportional hazards modeling was used to compare the hazard of the primary composite outcome between groups. Logistic regression was used to evaluate the odds of each outcome including diabetes incidence, and several models were built, ranging from an unadjusted model to one adjusting for a number of lifestyle factors (eg, smoking status, physical activity), baseline health conditions, and sociodemographic factors.
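The Kaplan-Meier curves mentioned above are built by multiplying, at each event time, the fraction of at-risk individuals who survive that time. A minimal pure-Python sketch under invented follow-up data (not the study's analysis code):

```python
def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = event (eg, MI/death), 0 = censored.
    Returns [(t, S(t))] at each distinct event time."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    at_risk, s, curve, i = n, 1.0, [], 0
    while i < n:
        t = times[order[i]]
        d = c = 0  # events and censorings occurring at time t
        while i < n and times[order[i]] == t:
            if events[order[i]]:
                d += 1
            else:
                c += 1
            i += 1
        if d:
            s *= 1 - d / at_risk  # survival drops only at event times
            curve.append((t, s))
        at_risk -= d + c
    return curve

# Toy cohort: events at t=2 (1 of 5 at risk) and t=4 (1 of 3 at risk)
print(kaplan_meier([2, 3, 4, 5, 6], [1, 0, 1, 0, 0]))
```

In practice a library such as lifelines would be used; the point here is only the step-down construction that underlies the curves compared between the “leaner” and “heavier” groups.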

The authors separately examined risk of MI/death in the subgroup of twins where the “heavier” twin had a BMI ≤ 24.9 kg/m2 at baseline (ie, despite being labeled “heavier” they still had a technically normal BMI), and examined the impact of weight trajectory prior to the defined baseline (eg, they were able to incorporate into models whether someone had been actively gaining or losing weight over time prior to the baseline exposure categorization). The authors also conducted several sensitivity analyses, including running models excluding twins with < 1 year of follow-up in an effort to ensure that results of the main analysis were not biased due to differential loss to follow-up between exposure categories.

Results. Of the 4046 twin pairs in this study, 56% (2283 pairs) were female, and mean (SD) age at baseline was 57.6 (9.5) years. Race/ethnicity was not reported but presumably the vast majority, if not all, are non-Hispanic white, based on the country of origin. In comparing the group of “heavier” twins to “leaner” twins, several important baseline differences were found. By design, the “heavier” twins had significantly higher mean (SD) BMI at study baseline (25.9 [3.6] kg/m2 vs. 23.9 [3.1] kg/m2) and reported greater increases in BMI over the 15–20 years preceding baseline (change since 1973 was +4.3 [2.9] BMI units for “heavier” twins, vs. +2.6 [2.6] for “leaner” twins). Smoking status differed significantly between groups, with 15% of “heavier” twins reporting they were current smokers versus ~21% of “leaner” twins. “Leaner” twins were also slightly more active than their “heavier” counterparts (50.4% reported getting “rather much or very much” exercise versus 46.5%). The groups were otherwise very similar with respect to marital status, educational level, income, and baseline diagnoses of MI, stroke, diabetes, cancer or alcohol abuse.

In fully adjusted models over a mean (SD) 12.4 (2.5)-year follow-up, “heavier” twins had significantly lower odds of MI or death (combined) than “leaner” twins (odds ratio [OR] 0.75, 95% CI 0.63–0.91). Because the “heavier” vs. “leaner” dichotomy did not map to clinical definitions of overweight or obesity, the investigators also examined this primary outcome among subgroups with more clinical relevance. Being “heavier” actually had the greatest protective effect against MI/death (OR 0.61, 95% CI 0.46–0.80) among pairs where the so-called “heavier” twin had a normal BMI (< 25.0 kg/m2), and this subgroup appeared to be driving the overall finding of lower odds of MI/death in the “heavier” group as a whole. This pattern was underscored when examining the subgroup of twin pairs where the “heavier” twin had a BMI ≥ 30 kg/m2 at baseline – in this group the protective effect of being “heavier” disappeared (OR 0.92, 95% CI 0.60–1.42). Besides not always reflecting clinically relevant weight categories, the “heavier” vs. “leaner” twin dichotomy could, in some cases, amount to a very small difference in BMI between twins (anything > 0.01 unit counted as discordant). As such, the investigators sought to examine whether their results held up when looking at pairs with a higher threshold for BMI discordance (1.0 to 7.0 units or more difference between twins), finding that risk of MI or death did not increase among the “heavier” group in these more widely split twin pairs, even when adjusting for smoking status and physical activity.
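Odds ratios like those reported above, with Wald-style 95% confidence intervals, can be computed from a 2×2 table of events by exposure group. The counts below are invented for illustration and do not reproduce the study's adjusted estimates:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b: events and non-events in the exposed group;
    c/d: events and non-events in the unexposed group.
    Returns (OR, lower, upper) using the Wald interval on log(OR)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 90/310 events among "heavier", 120/280 among "leaner"
or_, lo, hi = odds_ratio_ci(90, 310, 120, 280)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An OR below 1 with an upper confidence bound below 1, as in the study's primary result, indicates significantly lower odds in the exposed group.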

In contrast to the MI/mortality analyses, “heavier” twins did have significantly greater odds of developing diabetes during follow-up compared to their “leaner” counterparts (OR 1.94, 95% CI 1.51 to 2.48, adjusted for smoking and physical activity). Also unlike the MI/death analyses, this relationship of increased diabetes risk among “heavier” twins was enhanced by increasing BMI dissimilarity between twins, and among twins who had been gaining weight prior to baseline BMI measurement.

Sensitivity analyses excluding twins with less than 1 year of follow-up did not result in changes to the main findings—“heavier” twins still had similar odds of MI/death as “leaner” twins.

Conclusion. The authors conclude that among monozygotic twin pairs, where the possibility for genetic confounding has been eliminated, obesity is not causally associated with increased risk of MI or death, although the results do support an increased risk of developing incident diabetes among individuals with higher BMI.

Commentary

Obesity is a known risk factor for many chronic conditions, including diabetes, osteoarthritis, sleep apnea, and hypertension [1]. However, the relationship between obesity and cardiovascular outcomes, particularly coronary artery disease and death from heart disease, has been more controversial. Some epidemiologic studies have demonstrated reduced mortality risk among patients with obesity and heart failure, and even among those with established coronary artery disease—the so-called “obesity paradox” [2]. Others have observed that overweight older adults may have lower overall mortality compared to their normal weight counterparts [3]. On the other hand, it is known that obesity increases risk for diabetes, which is itself a clear and proven risk factor for CVD and death.

As the authors of the current study point out, genetic confounding may be a potential reason for the conflicting results produced in studies of the obesity–CVD risk relationship. In other words, patients who have genes that promote weight gain may also have genes that promote CVD, through pathways independent of excess adipose tissue, with these hidden pathways acting as confounders of the obesity–CVD relationship. By studying monozygotic twin pairs, who have identical genetic makeup but have developed differential weight status due to different environmental exposures, the investigators designed a study that would eliminate any genetic confounding and allow them to better isolate the relationship between higher BMI and CVD. This is an important topic area because, at a population level, we are faced with an immense number of adults who have obesity. Treatment of this condition is resource-intensive, and it is critical that patients and health care systems understand the potential risk reduction that will be achieved with sustained weight loss.

The strengths of this study include the use of a very unique dataset with longitudinal measures on a large number of monozygotic twin pairs, and the authors’ ability to link this dataset with nationwide comprehensive datasets on health conditions, health care use (pharmacy), sociodemographics, and death. Sweden’s national registries are quite impressive and permit these types of studies in a way that would be very difficult to achieve in the United States, with its innumerable separate health care systems and few data sources that contain information on all citizens. Because of these multiple data sources, the authors were able to adjust for some important lifestyle factors that could easily confound the weight status-MI/death relationship, such as smoking and physical activity. Additionally, their models were able to factor in trajectory of weight on some individuals prior to baseline, rather than viewing baseline weight only as a “snapshot” which could risk missing an important trend of weight gain or loss over time, with important health implications.

There are several limitations of the study that are worth reviewing. First, and most importantly, as pointed out in a commentary associated with the article, the categorization of “leaner” and “heavier” can be somewhat misleading if the true question is whether or not excess adiposity is an independent driver of cardiovascular risk [4]. BMI, at the individual level, is not an ideal measure of adiposity and it does not speak to distribution of fat tissue, which is critically important in evaluating CVD risk [5]. For example, 2 siblings could have identical BMIs, but one might have significantly more lean mass in their legs and buttocks, and the other could have more central adipose tissue, translating to a much higher cardiovascular risk. Measures such as waist circumference are critical factors in addition to BMI to better understand an individual’s adipose tissue volume and distribution.

Although the authors did adjust for some self-reported behaviors that are important predictors of CVD (smoking, exercise), there is still potential for confounding due to unscreened or unreported exposures that differ systematically between “leaner” and “heavier” twins. Of note, smoking status—probably the single most important risk factor for CVD—was missing in 13% of the cohort, and no imputation techniques were used for missing data. Another limitation of this study is that its generalizability to more racial/ethnically diverse populations may be limited. Presumably, the patients in this study were non-Hispanic white Swedes, and whether or not these findings would be replicated in other groups, such as those of African or Asian ancestry, is not known.

Finally, the finding that “heavier” twins had greater odds of developing diabetes during follow-up is certainly consistent with existing literature. However, it is also known that diabetes is a strong risk factor for the development of CVD, including MI, and for death [6]. This raises the question of why the authors observed an increased diabetes risk yet no change in MI/death rates among heavier twins. Most likely the discrepancy is due to inadequate follow-up time of incident diabetes cases. Complications of diabetes can take a number of years to materialize, and, with an average of 12 years’ total follow-up in this study, there simply may not have been time to observe an increased risk of MI/death in heavier twins.

Applications for Clinical Practice

For patients interested in weight loss as a way of reducing CVD risk, this paper does not support the notion that lower body weight alone exerts direct influence on this endpoint. However, it reinforces the link between higher body weight and diabetes, which is a clear risk factor for CVD. Therefore, it still seems reasonable to advise patients who are at risk of diabetes that improving dietary quality, increasing cardiorespiratory fitness, and losing weight can reduce their long-term risk of CVD, even if indirectly so.

 

—Kristina Lewis, MD, MPH

References

1. Jensen MD, Ryan DH, Apovian CM, et al. 2013 AHA/ACC/TOS guideline for the management of overweight and obesity in adults: a report of the American College of Cardiology/American Heart Association Task Force on Practice Guidelines and The Obesity Society. Circulation 2014;129(25 Suppl 2):S102–138.

2. Antonopoulos AS, Oikonomou EK, Antoniades C, Tousoulis D. From the BMI paradox to the obesity paradox: the obesity-mortality association in coronary heart disease. Obes Rev 2016;17:989–1000.

3. Flegal KM, Kit BK, Orpana H, Graubard BI. Association of all-cause mortality with overweight and obesity using standard body mass index categories: a systematic review and meta-analysis. JAMA 2013;309:71–82.

4. Davidson DJ, Davidson MH. Using discordance in monozygotic twins to understand causality of cardiovascular disease risk factors. JAMA Intern Med 2016;176:1530.

5. Amato MC, Guarnotta V, Giordano C. Body composition assessment for the definition of cardiometabolic risk. J Endocrinol Invest 2013;36:537–43.

6. The Emerging Risk Factors Collaboration, Seshasai SR, Kaptoge S, et al. Diabetes mellitus, fasting glucose, and risk of cause-specific death. N Engl J Med 2011;364:829–41.

Issue
Journal of Clinical Outcomes Management - December 2016, Vol. 23, No. 12


Sensitivity analyses excluding twins with less than 1 year of follow-up did not result in changes to the main findings—“heavier” twins still had similar odds of MI/death as “leaner” twins.

Conclusion. The authors conclude that among monozygotic twin pairs, where the possibility for genetic confounding has been eliminated, obesity is not causally associated with increased risk of MI or death, although the results do support an increased risk of developing incident diabetes among individuals with higher BMI.

Commentary

Obesity is a known risk factor for many chronic conditions, including diabetes, osteoarthritis, sleep apnea, and hypertension [1]. However, the relationship between obesity and cardiovascular outcomes, particularly coronary artery disease and death from heart disease, has been more controversial. Some epidemiologic studies have demonstrated reduced mortality risk among patients with obesity and heart failure, and even among those with established coronary artery disease—the so-called “obesity paradox” [2]. Others have observed that overweight older adults may have lower overall mortality compared to their normal weight counterparts [3]. On the other hand, it is known that obesity increases risk for diabetes, which is itself a clear and proven risk factor for CVD and death.

As the authors of the current study point out, genetic confounding may be a potential reason for the conflicting results produced in studies of the obesity–CVD risk relationship. In other words, patients who have genes that promote weight gain may also have genes that promote CVD, through pathways independent of excess adipose tissue, with these hidden pathways acting as confounders of the obesity–CVD relationship. By studying monozygotic twin pairs, who have identical genetic makeup but have developed differential weight status due to different environmental exposures, the investigators designed a study that would eliminate any genetic confounding and allow them to better isolate the relationship between higher BMI and CVD. This is an important topic area because, at a population level, we are faced with an immense number of adults who have obesity. Treatment of this condition is resource intense and it is critical that patients and health care systems understand the potential risk reduction that will be achieved with sustained weight loss.

The strengths of this study include the use of a very unique dataset with longitudinal measures on a large number of monozygotic twin pairs, and the authors’ ability to link this dataset with nationwide comprehensive datasets on health conditions, health care use (pharmacy), sociodemographics, and death. Sweden’s national registries are quite impressive and permit these types of studies in a way that would be very difficult to achieve in the United States, with its innumerable separate health care systems and few data sources that contain information on all citizens. Because of these multiple data sources, the authors were able to adjust for some important lifestyle factors that could easily confound the weight status-MI/death relationship, such as smoking and physical activity. Additionally, their models were able to factor in trajectory of weight on some individuals prior to baseline, rather than viewing baseline weight only as a “snapshot” which could risk missing an important trend of weight gain or loss over time, with important health implications.

There are several limitations of the study that are worth reviewing. First, and most importantly, as pointed out in a commentary associated with the article, the categorization of “leaner” and “heavier” can be somewhat misleading if the true question is whether or not excess adiposity is an independent driver of cardiovascular risk [4]. BMI, at the individual level, is not an ideal measure of adiposity and it does not speak to distribution of fat tissue, which is critically important in evaluating CVD risk [5]. For example, 2 siblings could have identical BMIs, but one might have significantly more lean mass in their legs and buttocks, and the other could have more central adipose tissue, translating to a much higher cardiovascular risk. Measures such as waist circumference are critical factors in addition to BMI to better understand an individual’s adipose tissue volume and distribution.

Although the authors did adjust for some self-reported behaviors that are important predictors of CVD (smoking, exercise), there is still potential for confounding due to unscreened or unreported exposures that differ systematically between “leaner” and “heavier” twins. Of note, smoking status—probably the single most important risk factor for CVD—was missing in 13% of the cohort, and no imputation techniques were used for missing data. Another limitation of this study is that its generalizability to more racial/ethnically diverse populations may be limited. Presumably, the patients in this study were non-Hispanic white Swedes, and whether or not these findings would be replicated in other groups, such as those of African or Asian ancestry, is not known.

Finally, the finding that “heavier” twins had greater odds of developing diabetes during follow-up is certainly consistent with existing literature. However, it is also known that diabetes is a strong risk factor for the development of CVD, including MI, and for death [6]. This raises the question of why the authors observed an increased diabetes risk yet no change in MI/death rates among heavier twins. Most likely the discrepancy is due to inadequate follow-up time of incident diabetes cases. Complications of diabetes can take a number of years to materialize, and, with an average of 12 years’ total follow-up in this study, there simply may not have been time to observe an increased risk of MI/death in heavier twins.

Applications for Clinical Practice

For patients interested in weight loss as a way of reducing CVD risk, this paper does not support the notion that lower body weight alone exerts direct influence on this endpoint. However, it reinforces the link between higher body weight and diabetes, which is a clear risk factor for CVD. Therefore, it still seems reasonable to advise patients who are at risk of diabetes that improving dietary quality, increasing cardiorespiratory fitness, and losing weight can reduce their long-term risk of CVD, even if indirectly so.

 

—Kristina Lewis, MD, MPH

Study Overview

Objective. To evaluate whether higher BMI alone contributes to risk of cardiovascular disease (CVD) and death.

Study design. Cohort study of weight-discordant monozygotic twin pairs.

Setting and participants. This study took place in Sweden, using a subset of data from the Swedish Twin Registry and the Screening Across Lifespan Twin (SALT) study, which aimed to screen Swedish twins born prior to 1958 for the development of “common complex diseases.” From a total of 44,820 individuals, the current study was limited to a subset of 4046 monozygotic twin pairs where both twins had self-reported height and weight data, and where calculated body mass index (BMI) was discordant between the twins, defined as a difference > 0.01 kg/m2. No other inclusion or exclusion criteria are mentioned. Data for the study were collected from several different sources, including telephone interviews (eg, height and weight, behaviors such as physical activity and smoking), national registries on health conditions (eg, myocardial infarction [MI], stroke, diabetes) or prescriptions (eg, diabetes medications), the national causes of death register, and a nationwide database containing socioeconomic variables (eg, income and education). The primary exposure of interest for this study was weight status, categorized as “leaner” or “heavier,” depending on the relative BMI of each twin in a given pair. “Leaner” twins were assumed to have lower adiposity than their “heavier” counterparts, yet to have identical genetic makeup, thereby allowing the authors to eliminate the contribution of genetic confounding in evaluating the relationship between weight status and CVD risk. The classification system could mean that one person with a BMI of 26 kg/m2 would be placed in the “leaner” category if their twin had a BMI of 28, while someone else in another twin pair but also with a BMI of 26 kg/m2 might be classified in the “heavier” category if their twin had a BMI of 22. Twin pairs were followed for up to 15 years to assess for incident outcomes of interest, with baseline data collected between 1998 and 2002, and follow-up through 2013.
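The within-pair exposure classification can be sketched in a few lines of Python. This is an illustrative reconstruction of the scheme described above, not the study's actual code; the function and variable names are invented.

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def classify_pair(bmi_a, bmi_b, min_discordance=0.01):
    """Label each twin in a pair "leaner" or "heavier" by relative BMI.

    Pairs whose absolute BMI difference does not exceed
    `min_discordance` (0.01 kg/m2, per the study) are treated as
    concordant and excluded from the analysis.
    """
    if abs(bmi_a - bmi_b) <= min_discordance:
        return None  # concordant pair: excluded
    return ("leaner", "heavier") if bmi_a < bmi_b else ("heavier", "leaner")

# The label is relative, not absolute: a twin with BMI 26 is "leaner"
# next to a twin with BMI 28 ...
print(classify_pair(26.0, 28.0))  # ('leaner', 'heavier')
# ... but "heavier" next to a twin with BMI 22.
print(classify_pair(26.0, 22.0))  # ('heavier', 'leaner')
```

This relativity is exactly why a "heavier" twin can still have a clinically normal BMI, which matters for the subgroup analyses below.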

Main outcome measures. The primary outcome of interest was the occurrence of incident MI or death from any cause. As above, these outcomes were assessed using national disease and death registries spanning 1987-2013, and ICD-9 or -10 codes of interest. A secondary outcome of incident diabetes was also specified, presumably limited to development of type 2 diabetes mellitus, and identified using the same datasets, as well as the national prescription registry. Kaplan-Meier curves for incident MI and death were constructed comparing all “leaner” twins against all “heavier” twins, and Cox proportional hazards modeling was used to compare the hazard of the primary composite outcome between groups. Logistic regression was used to evaluate the odds of each outcome including diabetes incidence, and several models were built, ranging from an unadjusted model to one adjusting for a number of lifestyle factors (eg, smoking status, physical activity), baseline health conditions, and sociodemographic factors.
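Kaplan-Meier curves like those described above are built from a running product over event times. A minimal pure-Python sketch of the estimator follows; the follow-up data in the example are made up for illustration and are not the study's.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimator.

    times  : follow-up time for each subject
    events : 1 if the subject had the event (here, MI or death),
             0 if censored (lost to follow-up or study end)
    Returns [(event_time, survival_probability), ...].
    """
    data = sorted(zip(times, events))
    n = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < n:
        t = data[i][0]
        at_risk = n - i  # subjects still under observation at time t
        d = 0
        while i < n and data[i][0] == t:  # events/censorings tied at t
            d += data[i][1]
            i += 1
        if d:  # the survival curve steps down only at event times
            surv *= 1 - d / at_risk
            curve.append((t, surv))
    return curve

# Four subjects: events at t=1 and t=3, censoring at t=2 and t=4.
print(kaplan_meier([1, 2, 3, 4], [1, 0, 1, 0]))  # [(1, 0.75), (3, 0.375)]
```

The Cox and logistic models in the study would in practice be fit with standard statistical packages rather than by hand; the sketch is only meant to make the survival-curve construction concrete.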

The authors separately examined risk of MI/death in the subgroup of twins where the “heavier” twin had a BMI ≤ 24.9 kg/m2 at baseline (ie, despite being labeled “heavier” they still had a technically normal BMI), and examined the impact of weight trajectory prior to the defined baseline (eg, they were able to incorporate into models whether someone had been actively gaining or losing weight over time prior to the baseline exposure categorization). The authors also conducted several sensitivity analyses, including running models excluding twins with < 1 year of follow-up in an effort to ensure that results of the main analysis were not biased due to differential loss to follow-up between exposure categories.

Results. Of the 4046 twin pairs in this study, 56% (2283 pairs) were female, and mean (SD) age at baseline was 57.6 (9.5) years. Race/ethnicity was not reported, but presumably the vast majority, if not all, were non-Hispanic white, based on the country of origin. In comparing the group of “heavier” twins to “leaner” twins, several important baseline differences were found. By design, the “heavier” twins had significantly higher mean (SD) BMI at study baseline (25.9 [3.6] kg/m2 vs. 23.9 [3.1] kg/m2) and reported greater increases in BMI over the 15–20 years preceding baseline (change since 1973 was +4.3 [2.9] BMI units for “heavier” twins, vs. +2.6 [2.6] for “leaner” twins). Smoking status differed significantly between groups, with 15% of “heavier” twins reporting they were current smokers versus ~21% of “leaner” twins. “Leaner” twins were also slightly more active than their “heavier” counterparts (50.4% reported getting “rather much or very much” exercise versus 46.5%). The groups were otherwise very similar with respect to marital status, educational level, income, and baseline diagnoses of MI, stroke, diabetes, cancer, or alcohol abuse.

In fully adjusted models over a mean (SD) 12.4 (2.5)-year follow-up, “heavier” twins had significantly lower odds of MI or death (combined) than “leaner” twins (odds ratio [OR] 0.75, 95% CI 0.63–0.91). Because the “heavier” vs. “leaner” dichotomy did not map to clinical definitions of overweight or obesity, the investigators also examined this primary outcome among subgroups with more clinical relevance. Being “heavier” actually had the greatest protective effect against MI/death (OR 0.61, 95% CI 0.46–0.80) among pairs where the so-called “heavier” twin had a normal BMI (< 25.0 kg/m2), and this subgroup appeared to be driving the overall finding of lower odds of MI/death in the “heavier” group as a whole. This pattern was underscored when examining the subgroup of twin pairs where the “heavier” twin had a BMI ≥ 30 kg/m2 at baseline – in this group the protective effect of being “heavier” disappeared (OR 0.92, 95% CI 0.60–1.42). Besides not always reflecting clinically relevant weight categories, the “heavier” vs. “leaner” twin dichotomy could, in some cases, amount to a very small difference in BMI between twins (anything > 0.01 unit counted as discordant). As such, the investigators sought to examine whether their results held up when looking at pairs with a higher threshold for BMI discordance (1.0 to 7.0 units or more difference between twins), finding that risk of MI or death did not increase among the “heavier” group in these more widely split twin pairs, even when adjusting for smoking status and physical activity.
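For readers who want the arithmetic behind such estimates, an unadjusted odds ratio and its Wald-type 95% CI can be computed directly from a 2 × 2 table. The study's reported ORs come from adjusted logistic models, so the sketch below (with invented counts) only illustrates the measure itself.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:

                  event   no event
    "heavier"       a        b
    "leaner"        c        d
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Invented counts: 10/100 "heavier" twins with the event vs. 20/100 "leaner".
or_, lo, hi = odds_ratio_ci(10, 90, 20, 80)
print(round(or_, 2))  # 0.44 -> lower odds of the event in the exposed row
```

An OR below 1 with a CI excluding 1 (as with the study's 0.75, 0.63–0.91) indicates significantly lower odds in the exposed group; a CI spanning 1 (as with 0.92, 0.60–1.42) does not.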

In contrast to the MI/mortality analyses, “heavier” twins did have significantly greater odds of developing diabetes during follow-up compared to their “leaner” counterparts (OR 1.94, 95% CI 1.51–2.48, adjusted for smoking and physical activity). Also unlike the MI/death analyses, this relationship of increased diabetes risk among “heavier” twins was enhanced by increasing BMI dissimilarity between twins, and among twins who had been gaining weight prior to baseline BMI measurement.

Sensitivity analyses excluding twins with less than 1 year of follow-up did not change the main findings: “heavier” twins still did not have higher odds of MI/death than “leaner” twins.

Conclusion. The authors conclude that among monozygotic twin pairs, where the possibility for genetic confounding has been eliminated, obesity is not causally associated with increased risk of MI or death, although the results do support an increased risk of developing incident diabetes among individuals with higher BMI.

Commentary

Obesity is a known risk factor for many chronic conditions, including diabetes, osteoarthritis, sleep apnea, and hypertension [1]. However, the relationship between obesity and cardiovascular outcomes, particularly coronary artery disease and death from heart disease, has been more controversial. Some epidemiologic studies have demonstrated reduced mortality risk among patients with obesity and heart failure, and even among those with established coronary artery disease—the so-called “obesity paradox” [2]. Others have observed that overweight older adults may have lower overall mortality compared to their normal weight counterparts [3]. On the other hand, it is known that obesity increases risk for diabetes, which is itself a clear and proven risk factor for CVD and death.

As the authors of the current study point out, genetic confounding may be a potential reason for the conflicting results produced in studies of the obesity–CVD risk relationship. In other words, patients who have genes that promote weight gain may also have genes that promote CVD, through pathways independent of excess adipose tissue, with these hidden pathways acting as confounders of the obesity–CVD relationship. By studying monozygotic twin pairs, who have identical genetic makeup but have developed differential weight status due to different environmental exposures, the investigators designed a study that would eliminate any genetic confounding and allow them to better isolate the relationship between higher BMI and CVD. This is an important topic area because, at a population level, we are faced with an immense number of adults who have obesity. Treatment of this condition is resource-intensive, and it is critical that patients and health care systems understand the potential risk reduction that can be achieved with sustained weight loss.

The strengths of this study include the use of a unique dataset with longitudinal measures on a large number of monozygotic twin pairs, and the authors’ ability to link this dataset with comprehensive nationwide datasets on health conditions, health care use (pharmacy), sociodemographics, and death. Sweden’s national registries are quite impressive and permit these types of studies in a way that would be very difficult to achieve in the United States, with its innumerable separate health care systems and few data sources that contain information on all citizens. Because of these multiple data sources, the authors were able to adjust for some important lifestyle factors that could easily confound the weight status–MI/death relationship, such as smoking and physical activity. Additionally, their models were able to factor in the weight trajectory of some individuals prior to baseline, rather than viewing baseline weight only as a “snapshot,” which could risk missing an important trend of weight gain or loss over time, with important health implications.

There are several limitations of the study that are worth reviewing. First, and most importantly, as pointed out in a commentary associated with the article, the categorization of “leaner” and “heavier” can be somewhat misleading if the true question is whether or not excess adiposity is an independent driver of cardiovascular risk [4]. BMI, at the individual level, is not an ideal measure of adiposity and it does not speak to distribution of fat tissue, which is critically important in evaluating CVD risk [5]. For example, 2 siblings could have identical BMIs, but one might have significantly more lean mass in their legs and buttocks, and the other could have more central adipose tissue, translating to a much higher cardiovascular risk. Measures such as waist circumference are critical factors in addition to BMI to better understand an individual’s adipose tissue volume and distribution.

Although the authors did adjust for some self-reported behaviors that are important predictors of CVD (smoking, exercise), there is still potential for confounding due to unscreened or unreported exposures that differ systematically between “leaner” and “heavier” twins. Of note, smoking status—probably the single most important risk factor for CVD—was missing in 13% of the cohort, and no imputation techniques were used for missing data. Another limitation of this study is that its generalizability to more racially/ethnically diverse populations may be limited. Presumably, the patients in this study were non-Hispanic white Swedes, and whether or not these findings would be replicated in other groups, such as those of African or Asian ancestry, is not known.

Finally, the finding that “heavier” twins had greater odds of developing diabetes during follow-up is certainly consistent with existing literature. However, it is also known that diabetes is a strong risk factor for the development of CVD, including MI, and for death [6]. This raises the question of why the authors observed an increased diabetes risk yet no change in MI/death rates among heavier twins. Most likely the discrepancy is due to inadequate follow-up time for incident diabetes cases. Complications of diabetes can take a number of years to materialize, and, with an average of 12 years’ total follow-up in this study, there simply may not have been time to observe an increased risk of MI/death in heavier twins.

Applications for Clinical Practice

For patients interested in weight loss as a way of reducing CVD risk, this paper does not support the notion that lower body weight alone exerts direct influence on this endpoint. However, it reinforces the link between higher body weight and diabetes, which is a clear risk factor for CVD. Therefore, it still seems reasonable to advise patients who are at risk of diabetes that improving dietary quality, increasing cardiorespiratory fitness, and losing weight can reduce their long-term risk of CVD, even if indirectly so.

 

—Kristina Lewis, MD, MPH

References

1. Jensen MD, Ryan DH, Apovian CM, et al. 2013 AHA/ACC/TOS guideline for the management of overweight and obesity in adults: a report of the American College of Cardiology/American Heart Association Task Force on Practice Guidelines and The Obesity Society. Circulation 2014;129(25 Suppl 2):S102–138.

2. Antonopoulos AS, Oikonomou EK, Antoniades C, Tousoulis D. From the BMI paradox to the obesity paradox: the obesity-mortality association in coronary heart disease. Obes Rev 2016;17:989–1000.

3. Flegal KM, Kit BK, Orpana H, Graubard BI. Association of all-cause mortality with overweight and obesity using standard body mass index categories: a systematic review and meta-analysis. JAMA 2013;309:71–82.

4. Davidson DJ, Davidson MH. Using discordance in monozygotic twins to understand causality of cardiovascular disease risk factors. JAMA Intern Med 2016;176:1530.

5. Amato MC, Guarnotta V, Giordano C. Body composition assessment for the definition of cardiometabolic risk. J Endocrinol Invest 2013;36:537–43.

6. The Emerging Risk Factors Collaboration, Seshasai SR, Kaptoge S, et al. Diabetes mellitus, fasting glucose, and risk of cause-specific death. N Engl J Med 2011;364:829–41.


Issue
Journal of Clinical Outcomes Management - December 2016, Vol. 23, No. 12
Display Headline
Does Higher BMI Directly Increase Risk of Cardiovascular Disease? Maybe Not . . .

Centers of excellence program raises quality bar in lung cancer management

Article Type
Changed
Fri, 01/04/2019 - 13:27

– A grassroots, patient-centric program aimed at encouraging U.S. community hospitals in underserved areas to adopt a comprehensive centers-of-excellence model for treatment of lung cancer has demonstrated that community cancer centers can achieve quality of care comparable to that found in academic medical centers, Raymond Osarogiagbon, MD, declared at the World Congress on Lung Cancer.

One major reason why aggregate lung cancer survival in the United States has barely inched upward during the past 3 decades is that there is often a huge gap between the quality of care patients get at academic research centers – including access to clinical trials – and what they can get in community hospitals. Eighty percent of lung cancer patients receive their care in these community cancer centers, where not all physicians and surgeons may be up to speed with guideline-recommended best practices, observed Dr. Osarogiagbon, a medical oncologist and director of the multidisciplinary thoracic oncology program at Baptist Cancer Center in Memphis.

[Photo: Dr. Raymond Osarogiagbon. Credit: Bruce Jancin/Frontline Medical News]
“Lung cancer is a devil of a problem right now. The disparities in outcome are worse than in most other cancers. There is a prevailing sense of nihilism about lung cancer in many community cancer centers ranging from nontreatment of some patients who have eminently treatable disease to widely disparate methods of treatment, some of which may not be appropriate. If you’re going to move the needle at the population level and help more people survive lung cancer, it seems you’re going to have to interfere at the level of the places patients choose to go to receive their care,” he explained at the meeting, which was sponsored by the International Association for the Study of Lung Cancer.

The centers of excellence program, supported by the Bonnie J. Addario Lung Cancer Foundation, is an attempt to address this disparity. In 3 years, 13 hospitals in areas with large underserved patient populations in nine states have qualified. Dr. Osarogiagbon anticipates that the competitive advantage this designation provides will spur more and more community hospitals with cancer centers to get on board.

To qualify, community cancer centers have to commit to following specific best practices as standards of care in accord with guidelines issued by groups including the National Comprehensive Cancer Network and the American Society of Clinical Oncology. Requirements include the use of multidisciplinary teams for treatment decisions, routine use of molecular diagnostics and targeted therapies, patient access to clinical trials, longitudinal institutional data tracking, patient and caregiver education programs, and minimally invasive surgical and staging techniques. A lung cancer screening program is required. So is a systematic program for management of all incidentally detected lung nodules, many of which today fall between the cracks.

In an interview, Dr. Osarogiagbon shared several examples of how achieving the foundation’s center of excellence designation enables community cancer centers to achieve top-quality care on a par with academic medical centers. At Memorial Cancer Institute in Hollywood, Fla., which serves large Hispanic and black populations, implementation of a lung cancer screening program and other measures has resulted in 40% of patients with lung cancer being diagnosed with stage 1 or 2 disease amenable to curative surgery. The overall U.S. rate is significantly lower at 29%.

And in an area composed of western Tennessee, northern Mississippi, and eastern Arkansas, Dr. Osarogiagbon and coinvestigators at the Baptist Cancer Center conducted a study demonstrating that the use of two complementary surgical staging interventions resulted in improved rates of guideline-recommended surgical staging quality.

The observational study, presented elsewhere at the world congress by Nicholas Faris, MD, of the Thoracic Oncology Research Group, Baptist Cancer Center, Memphis, entailed analysis of curative-intent resections in 2,094 patients with non–small-cell lung cancer during 2004-2016. A novel anatomically sound gross dissection protocol was provided to assist pathologists in retrieving the intrapulmonary lymph nodes required for staging in 161 patients undergoing curative resection. A special lymph node specimen collection kit was utilized in 152. Another 289 resections utilized both interventions. And 1,492 patients received neither intervention.

Use of the interventions was associated with higher rates of adherence to various professional organizations’ guidelines for high-quality surgical staging. For example, the American College of Surgeons Commission on Cancer recommends that at least 10 lymph nodes be examined in patients with stage 1a-2b NSCLC. This was achieved in 71% of patients who had both interventions, 56% of those where the lymph node specimen collection kit was utilized but not the pathology intervention, 48% of patients where the pathology intervention but not the kit was employed, and in only 25% of patients where neither was used.

Similarly, more than three mediastinal lymph node stations were sampled in accord with National Comprehensive Cancer Network guidelines in 96% of patients with both interventions, 90% with the lymph node collection kit, 54% with the pathology intervention, and only 44% with neither, Dr. Osarogiagbon said in the interview.

Dr. Osarogiagbon reported serving as a consultant to Eli Lilly, Genentech, and the Association of Community Cancer Centers.


The observational study, presented elsewhere at the world congress by Nicholas Faris, MD, of the Thoracic Oncology Research Group, Baptist Cancer Center, Memphis, entailed analysis of curative-intent resections in 2,094 patients with non–small-cell lung cancer during 2004-2016. A novel anatomically sound gross dissection protocol was provided to assist pathologists in retrieving the intrapulmonary lymph nodes required for staging in 161 patients undergoing curative resection. A special lymph node specimen collection kit was utilized in 152. Another 289 resections utilized both interventions. And 1,492 patients received neither intervention.

Use of the interventions was associated with higher rates of adherence to various professional organizations’ guidelines for high-quality surgical staging. For example, the American College of Surgeons Commission on Cancer recommends that at least 10 lymph nodes be examined in patients with stage 1a-2b NSCLC. This was achieved in 71% of patients who had both interventions, 56% of those where the lymph node specimen collection kit was utilized but not the pathology intervention, 48% of patients where the pathology intervention but not the kit was employed, and in only 25% of patients where neither was used.

Similarly, more than three mediastinal lymph node stations were sampled in accord with National Comprehensive Cancer Network guidelines in 96% of patients with both interventions, 90% with the lymph node collection kit, 54% with the pathology intervention, and only 44% with neither, Dr. Osarogiagbon said in the interview.

Dr. Osarogiagbon reported serving as a consultant to Eli Lilly, Genentech, and the Association of Community Cancer Centers.


EXPERT ANALYSIS FROM WCLC 2016

Survey sheds light on clinical neurophysiology fellowship training



HOUSTON – Fellowship training in clinical neurophysiology delivered high rates of satisfaction, but recommended areas for improvement include more focus on training in sleep, brain mapping, and evoked potentials, results from a survey of current trainees found.

“There has been no systematic evaluation of neurophysiology fellowships,” Zulfi Haneef, MD, said in an interview in advance of the annual meeting of the American Epilepsy Society. “In fact, there are some studies on neurology residency, but not on any of the neurology fellowships. This study opens a window into what the situation is after residency: what makes the residents decide on a fellowship, and how they feel they have done with the choice of fellowship.”

Dr. Zulfi Haneef
In late 2015, Dr. Haneef, a neurologist at the Baylor College of Medicine, Houston, and his associates sent a 17-item, Internet-based survey to current neurophysiology fellows in the United States, with the intent to collect data regarding demographics, reasons for choosing fellowship, adequacy of training, and future career plans. Of the 49 trainees who responded, 84% had graduated from a U.S. medical school. While the primary motivator for choosing the fellowship was a personal interest in clinical neurophysiology (92%), the most common secondary and tertiary motivators were reimbursement (47%) and academic prestige (49%), respectively. Program choice was most often guided by location and clinical strength of the program (cited by 52% and 42% of respondents, respectively). “These are factors which can be highlighted by various programs to improve recruiting top candidates,” Dr. Haneef said.

Overall, 87% of respondents expressed satisfaction with their current program and rated it as 4 or a 5 on a 1-5 Likert scale. “It seems that choosing a program based on the location may not have been the best thing to do,” he said. “Satisfaction scores (on a 1-5 scale for the training) were lower among those who chose a program based on location.” Less time spent in the epilepsy monitoring unit and EEG monitoring was also associated with higher satisfaction scores. “Rather than a lack of interest in epilepsy monitoring and EEG, this may reflect overemphasis on these at the expense of other areas, as these were also the areas that appeared to be most stressed during training,” Dr. Haneef said. The researchers observed no differences between male and female respondents in their answers to the various survey questions.

Based on the survey results, better clinical neurophysiology training in some areas was deemed necessary by the responding fellows, including sleep, brain mapping, and evoked potentials. Dr. Haneef acknowledged certain limitations of the study, including the fact that it was voluntary. “As such there could be some self-selection of fellows who have stronger viewpoints that pushed them to respond to a survey,” he said. He reported having no financial disclosures related to the study.


AT AES 2016
Vitals

Key clinical point: A majority of clinical neurophysiology trainees appear to be satisfied with their choice of fellowship.

Major finding: Overall, 87% of clinical neurophysiology fellows expressed satisfaction with their current program and rated it as 4 or a 5 on a 1-5 Likert scale.

Data source: An Internet-based survey of 49 clinical neurophysiology fellows in the United States.

Disclosures: Dr. Haneef reported having no financial disclosures related to the study.

VIDEO: Artificial blood cells clear first phase of animal testing


– An artificial red blood cell has come close to emulating the key functions of natural cells and does not appear to be associated with the side effects such as vasospasm and poor response to changes in blood pH that hampered the development of previous artificial blood products, Allan Doctor, MD, reported at the annual meeting of the American Society of Hematology.

The bio-synthetic cells, called ErythroMer, are about 1/50th the size of natural red blood cells. They can be stored at room temperature and reconstituted with water when needed for use.

In a mouse model, the ErythroMer cells were shown to capture oxygen in the lungs and release it to tissue in a pattern that was nearly identical to blood transfusion. In a rat model of shock, ErythroMer was effective for resuscitation.

In a video interview, Dr. Doctor of Washington University in St. Louis discussed the pharmacokinetics of ErythroMer, the need for a readily available blood substitute for treating trauma patients, other potential uses for artificial blood cells, and next steps for testing the product.

Dr. Doctor has equity ownership in KaloCyte, the company developing ErythroMer. He receives research funding from Children’s Discovery Institute and the National Institutes of Health.



AT ASH 2016

Halving the TKI dose safe, cost effective in CML patients with stable remissions


– For some chronic myeloid leukemia patients with solid, stable remissions, halving their dose of a tyrosine kinase inhibitor – or even stopping therapy altogether, at least temporarily – appears to be safe and to offer both health and financial benefits, European investigators said at the annual meeting of the American Society of Hematology.

 

In the British De-escalation and Stopping Treatment of Imatinib, Nilotinib, or Sprycel [dasatinib] (Destiny) study, a total of 12 molecular relapses occurred between the second and twelfth month of dose reduction among 174 patients with either an MR3 or MR4 molecular response, and all 12 patients had restoration of molecular remissions after resumption of full-dose TKIs, reported co-investigator Mhairi Copland, MD, PhD, from the University of Glasgow, Scotland.

Dr. Mhairi Copland
Halving the TKI dose was also associated with a nearly 50% savings in the expected costs of full-dose TKI therapy. Individual adverse events also diminished significantly in the first 3 months of de-escalation, but not thereafter.

“What we wanted to explore in the Destiny study is cutting the dose of tyrosine kinase inhibitor therapy in CML by half, followed by stopping therapy not just in patients with undetectable disease but also with stable low levels of disease,” Dr. Copland said during a briefing at the meeting.

“We hypothesized that more patients would be able to reduce therapy safely, and a proportion of these would be able to go on to stop therapy; also, that the patients on half-dose therapy would have reduced amount of side effects compared to those on full-dose therapy,” she added.

Several recent studies, including the EURO-SKI trial, have shown that it is safe to stop TKI therapy in those patients who are optimally responding and have undetectable levels of the BCR-ABL transcript.

Rendezvous with Destiny

In Destiny, the investigators enrolled patients with “good, but not perfect” molecular responses: MR3 or better, defined as a minimum of 3 consecutive tests each with greater than 10,000 ABL control transcripts following a minimum of 3 years on a TKI at standard prescribed doses. The median overall duration of TKI therapy was 7 years.

Participants on imatinib had their daily doses reduced to 200 mg, those on nilotinib had their doses cut back to 200 mg twice daily, and those on dasatinib had their daily doses halved to 50 mg.

After 12 months of half-dose therapy, molecular recurrence, defined as a loss of MR3 on two consecutive samples, was detected in 9 of 49 patients (18.4%) with MR3 but not MR4 remissions, compared with 3 of 125 patients (2.4%) with MR4 or better remissions (P less than .001).

The median time to relapse was 4.4 months among MR3/not 4 patients vs. 8.7 months for MR4 or better patients.
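The recurrence proportions quoted above follow directly from the raw counts reported in the study; a minimal check, using only figures stated in the text:

```python
# Reproduce the molecular recurrence rates reported in the Destiny study.
# All counts come from the article; nothing here is new data.
mr3_relapses, mr3_patients = 9, 49     # MR3-but-not-MR4 remission group
mr4_relapses, mr4_patients = 3, 125    # MR4-or-better remission group

print(f"MR3 only: {100 * mr3_relapses / mr3_patients:.1f}%")  # 18.4%
print(f"MR4+:     {100 * mr4_relapses / mr4_patients:.1f}%")  # 2.4%
```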

The probability of molecular recurrence on dose reduction was unrelated to either age, sex, performance status, type of TKI, or the duration of TKI therapy (median 7 years overall).

No patients experienced either progression to advanced phase disease or loss of cytogenetic response. During the course of follow-up, one patient died, and there were 15 serious adverse events, but these were determined to be unrelated to either CML or TKI treatment.

All 12 patients who experienced molecular recurrence regained MR3 within 4 months of resuming TKI therapy at the full dose.

As noted before, patient-reported side effects such as lethargy, diarrhea, rash, nausea, periorbital edema, and hair thinning decreased during the first 3 months of de-escalation, but not thereafter. Dr. Copland said that patients had generally good quality-of-life scores at study entry, suggesting that they were likely not especially bothered by TKI side effects in the first place.

The investigators calculated that for the 174 patients, halving treatment would save an estimated £1,943,364 ($2,474,679) from an expected TKI budget of £4,156,969 ($5,293,484), a savings of 46.7%. Estimated savings were similar for patients with MR4 or better alone (47.7%) and for those with a major molecular response (44.2%).
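As a quick arithmetic check, the overall savings percentage can be reproduced from the published budget totals (a minimal sketch using the sterling figures from the report; the dollar figures simply reflect the exchange rate the investigators applied):

```python
# Sanity check of the Destiny study's reported cost savings from dose halving.
full_budget_gbp = 4_156_969  # expected full-dose TKI budget for 174 patients (GBP)
saved_gbp = 1_943_364        # estimated savings from halving treatment (GBP)

savings_pct = 100 * saved_gbp / full_budget_gbp
print(f"Savings: {savings_pct:.1f}% of the expected budget")  # Savings: 46.7% ...
```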

EURO-SKI Update

Also at ASH 2016, Francois-Xavier Mahon, MD, PhD, from the University of Bordeaux, France, reported additional follow-up data from the EURO-SKI trial, results of which were first reported at the 2016 annual meeting of the European Hematology Association in Copenhagen.

Dr. Francois-Xavier Mahon

 

The investigators found that 50% of 755 assessable patients with CML were free of molecular recurrence at 24 months, as were 47% at 36 months.

As reported previously, patients who had been on a TKI for more than 5.8 years before attempting to stop had a lower rate of relapse (34.5%) than patients who had been on therapy for less than 5.8 years (57.4%). Each additional year of TKI therapy was associated with an approximately 16% better chance of successful TKI cessation.

“With inclusion and relapse criteria less strict than in many previous trials, and with decentralized but standardized PCR monitoring, stopping of TKI therapy in a large cohort of CML patients appears feasible and safe,” Dr. Mahon said at the briefing.

The British Destiny Study was supported by Newcastle University. Dr. Copland reported honoraria, advisory board memberships, and/or research funding from Amgen, Pfizer, Shire, BMS, and Ariad.

EURO-SKI was sponsored by the European LeukemiaNet. Dr. Mahon has previously disclosed being on the scientific advisory board and receiving honoraria from Novartis Oncology and BMS, and serving as consultant to those companies and to Pfizer.


“With inclusion and relapse criteria less strict than in many previous trials, and with decentralized but standardized PCR monitoring, stopping of TKI therapy in a large cohort of CML patients appears feasible and safe,” Dr. Mahon said at the briefing.

The British Destiny Study was supported by Newcastle University. Dr. Copland reported honoraria, advisory board memberships, and/or research funding from Amgen, Pfizer, Shire, BMS, and Ariad.

EURO-SKI was sponsored by the European LeukemiaNet. Dr. Mahon has previously disclosed being on the scientific advisory board and receiving honoraria from Novartis Oncology and BMS, and serving as consultant to those companies and to Pfizer.

– For some chronic myeloid leukemia (CML) patients with solid, stable remissions, halving their dose of a tyrosine kinase inhibitor (TKI) – or even stopping therapy altogether, at least temporarily – appears to be safe and to offer both health and financial benefits, European investigators said at the annual meeting of the American Society of Hematology.

 

In the British De-escalation and Stopping Treatment of Imatinib, Nilotinib, or Sprycel [dasatinib] (Destiny) study, 12 molecular relapses occurred between the second and twelfth month of dose reduction among 174 patients with an MR3 or MR4 molecular response, and all 12 patients had restoration of molecular remission after resumption of full-dose TKIs, reported co-investigator Mhairi Copland, MD, PhD, of the University of Glasgow, Scotland.

Halving the TKI dose was also associated with a nearly 50% savings in the expected costs of full-dose TKI therapy. Individual adverse events also diminished significantly in the first 3 months of de-escalation, but not thereafter.

“What we wanted to explore in the Destiny study is cutting the dose of tyrosine kinase inhibitor therapy in CML by half, followed by stopping therapy not just in patients with undetectable disease but also with stable low levels of disease,” Dr. Copland said during a briefing at the meeting.

“We hypothesized that more patients would be able to reduce therapy safely, and a proportion of these would be able to go on to stop therapy; also, that the patients on half-dose therapy would have reduced amount of side effects compared to those on full-dose therapy,” she added.

Several recent studies, including the EURO-SKI trial, have shown that it is safe to stop TKI therapy in those patients who are optimally responding and have undetectable levels of the BCR-ABL transcript.

Rendezvous with Destiny

In Destiny, the investigators enrolled patients with “good, but not perfect” molecular responses: MR3 or better (a BCR-ABL transcript level of 0.1% or less on the international scale), confirmed on a minimum of 3 consecutive tests, each with greater than 10,000 ABL control transcripts, after a minimum of 3 years on a TKI at standard prescribed doses. The median overall duration of TKI therapy was 7 years.

Participants on imatinib had their daily dose reduced to 200 mg, those on nilotinib had their dose cut back to 200 mg twice daily, and those on dasatinib had their daily dose halved to 50 mg.

After 12 months of half-dose therapy, molecular recurrence, defined as a loss of MR3 on two consecutive samples, was detected in 9 of 49 patients (18.4%) with MR3 but not MR4 remissions, compared with 3 of 125 patients (2.4%) with MR4 or better remissions (P < .001).
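For readers who want to check the percentages, the raw counts reported above reproduce them directly (a quick arithmetic sketch, not part of the study's own analysis):

```python
# Recurrence proportions recomputed from the reported counts.
mr3_only = 9 / 49    # molecular recurrence, MR3-but-not-MR4 group
mr4_plus = 3 / 125   # molecular recurrence, MR4-or-better group
print(f"{100 * mr3_only:.1f}% vs {100 * mr4_plus:.1f}%")  # 18.4% vs 2.4%
```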

The median time to relapse was 4.4 months among MR3/not 4 patients vs. 8.7 months for MR4 or better patients.

The probability of molecular recurrence on dose reduction was unrelated to age, sex, performance status, type of TKI, or duration of TKI therapy (median, 7 years overall).

No patients experienced either progression to advanced phase disease or loss of cytogenetic response. During the course of follow-up, one patient died, and there were 15 serious adverse events, but these were determined to be unrelated to either CML or TKI treatment.

All 12 patients who experienced molecular recurrence regained MR3 within 4 months of resuming TKI therapy at the full dose.

As noted before, patient-reported side effects such as lethargy, diarrhea, rash, nausea, periorbital edema, and hair thinning decreased during the first 3 months of de-escalation, but not thereafter. Dr. Copland said that patients had generally good quality-of-life scores at study entry, suggesting that they were likely not especially bothered by TKI side effects in the first place.

The investigators calculated that for the 174 patients, halving treatment would save an estimated £1,943,364 ($2,474,679) from an expected TKI budget of £4,156,969 ($5,293,484), a savings of 46.7%. Estimated savings were similar for patients with MR4 or better alone (47.7%) and for those with a major molecular response (44.2%).
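The stated savings percentage follows directly from the two sterling figures (a verification sketch using only the numbers reported above):

```python
# Reproducing the reported savings percentage from the stated figures.
expected_budget = 4_156_969   # GBP, expected full-dose TKI budget
estimated_saving = 1_943_364  # GBP, estimated saving from halved dosing
saving_pct = 100 * estimated_saving / expected_budget
print(f"{saving_pct:.1f}%")  # 46.7%, matching the reported figure
```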

EURO-SKI Update

Also at ASH 2016, Francois-Xavier Mahon, MD, PhD, from the University of Bordeaux, France, reported additional follow-up data from the EURO-SKI trial, results of which were first reported at the 2016 annual meeting of the European Hematology Association in Copenhagen.


The investigators found that 50% of 755 assessable patients with CML were free of molecular recurrence at 24 months, as were 47% at 36 months.

As reported previously, patients who had been on a TKI for more than 5.8 years before attempting to stop had a lower rate of relapse (34.5%) than patients who had been on therapy for less than 5.8 years (57.4%). Each additional year of TKI therapy was associated with an approximately 16% better chance of successful TKI cessation.

“With inclusion and relapse criteria less strict than in many previous trials, and with decentralized but standardized PCR monitoring, stopping of TKI therapy in a large cohort of CML patients appears feasible and safe,” Dr. Mahon said at the briefing.

The British Destiny Study was supported by Newcastle University. Dr. Copland reported honoraria, advisory board memberships, and/or research funding from Amgen, Pfizer, Shire, BMS, and Ariad.

EURO-SKI was sponsored by the European LeukemiaNet. Dr. Mahon has previously disclosed being on the scientific advisory board and receiving honoraria from Novartis Oncology and BMS, and serving as consultant to those companies and to Pfizer.

Article Source

FROM ASH 2016

Vitals

Key clinical point: Halving TKI doses in patients with chronic myeloid leukemia in stable remission is safe and cost effective.

Major finding: After halving TKI doses, there were 12 molecular relapses among 174 patients with an MR3 or better molecular response.

Data source: Prospective dose-reduction study in 174 patients with CML in MR3 remission or better.

Disclosures: The British Destiny Study was supported by Newcastle University. Dr. Copland reported honoraria, advisory board memberships, and/or research funding from Amgen, Pfizer, Shire, BMS, and Ariad. EURO-SKI was sponsored by the European LeukemiaNet. Dr. Mahon has previously disclosed being on the scientific advisory board and receiving honoraria from Novartis Oncology and BMS, and serving as consultant to those companies and to Pfizer.

Anti-CD22 CAR T-cells shift ALL into complete remission

Article Type
Changed
Fri, 01/04/2019 - 09:57

– When one CAR stops working, try another: chimeric antigen receptor (CAR) T-cell therapy for children and young adults with acute lymphoblastic leukemia is driving forward with a novel anti-CD22 target that in an early dose-finding trial has induced complete remissions in some patients with relapsed or refractory disease, including patients previously treated with anti-CD19 CAR-T therapy.

In the first-in-humans trial, CAR T-cell therapy directed against CD22 was shown to be safe and was associated with minimal residual disease (MRD)-negative complete remissions in eight of 10 children and young adults with relapsed/refractory B-precursor acute lymphoblastic leukemia treated at the highest dose level.

One patient remains in remission more than 1 year after treatment, one had a 6-month remission, and one had a remission lasting 3 months.

“This is the first successful salvage CAR therapy for CD19-negative B-[lineage] ALL,” said co-principal investigator Terry J. Fry, MD, from the Center for Cancer Research at the National Cancer Institute in Bethesda, Md.

Preliminary experience with anti-CD22 immunotherapy suggests that it is comparable in potency to anti-CD19 CAR, and investigators are exploring the possibility that the two chimeric antigen targets could be combined for greater efficacy, he said during a briefing at the annual meeting of the American Society of Hematology.

Tough target

As reported previously from the 2013 ASH annual meeting, anti-CD19 CAR T cells induced complete responses in 10 of 16 children and young adults with relapsed/refractory ALL, and in a second study, CD19-targeted T cells induced complete molecular responses in 12 of 16 adults with B-lineage ALL refractory to chemotherapy.

In current phase 2 trials, anti-CD19 CAR-T therapy is associated with complete remission rates of 80% to 90%.

However, “we’re learning now that one of the limitations of this approach is the loss of CD19 expression occurring in a substantial number of patients, although it has not been systematically analyzed,” Dr. Fry said.

CD22, an antigen restricted to B-lineage cells, is a promising alternative to CD19 as a target, but finding just the right anti-CD22 CAR was tricky, Dr. Fry said in an interview. The investigators found that many candidates bound well but had no efficacy, and it took several years of trying before they identified the current version of the anti-CD22 CAR.

In the phase 1 trial, the investigators enrolled 16 children and young adults (ages 7 to 22 years) with relapsed/refractory CD22-positive hematologic malignancies. All patients had previously undergone at least one allogeneic stem cell transplant, 11 had previously received anti-CD19 CAR T-cell therapy, and 9 were CD19-negative or had reduced CD19 expression on ALL cells.

Peripheral blood mononuclear cells (PBMCs) were collected from the patients through autologous leukapheresis. The T cells were then enriched, expanded, and transduced over 7 to 10 days with a lentiviral vector encoding an anti-CD22 CAR, enabling the cells to identify and bind CD22 expressed on ALL blasts.

The patients then underwent lymphodepletion with fludarabine and cyclophosphamide and received infusions of the transduced T cells at one of three dose levels: 3 × 10⁵ transduced T cells per kilogram of recipient weight (DL-1), 1 × 10⁶/kg (DL-2), or 3 × 10⁶/kg (DL-3).
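To make the weight-based dosing concrete, the absolute cell numbers can be computed for a recipient of a given weight (the 50-kg weight below is hypothetical, chosen only for illustration):

```python
# Illustrative absolute cell doses at each level; the 50-kg recipient
# weight is hypothetical, not taken from the trial.
dose_levels_per_kg = {"DL-1": 3e5, "DL-2": 1e6, "DL-3": 3e6}

weight_kg = 50
for level, per_kg in dose_levels_per_kg.items():
    print(f"{level}: {per_kg * weight_kg:.1e} transduced T cells")
```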

The complete remission rate at DL-2 and DL-3 combined was 80%, and cytokine-release syndrome (CRS) reached a maximum of grade 2.

As noted before, three of the remissions were comparatively durable, with one lasting more than a year.

There were no dose-limiting toxicities at DL-2, and grade 4 hypoxia at DL-3 was seen in one patient. One patient in an expansion cohort died of sepsis and multiorgan failure. There have been no cases of severe neurotoxicity thus far.

Of the five patients who experienced relapse, one, treated at DL-1, had a loss of CAR cells, and four had changes in CD22 expression, primarily a decrease in site density that may allow CD22 expression to fall below the threshold for CAR activity, Dr. Fry said.

“At least in our eyes, this may not be best used as a salvage therapy, but we’re beginning to think about how this should be included with CD19 in the upfront CAR treatment,” he said.

The study was funded by the National Institutes of Health with support from Lentigen and Juno Therapeutics. Dr. Fry reported no relevant disclosures.



Article Source

FROM ASH 2016

Vitals

 

Key clinical point: CAR T-cell therapy directed against the CD22 antigen induced complete, MRD-negative remissions in children/young adults with acute lymphoblastic leukemia.

Major finding: The complete remission rate among patients treated at the two highest dose levels was 80%.

Data source: Phase 1 dose-finding trial in 16 children/young adults with relapsed/refractory ALL or diffuse large B-cell lymphoma.

Disclosures: The study was funded by the National Institutes of Health with support from Lentigen and Juno Therapeutics. Dr. Fry reported no relevant disclosures.

Malaria elimination in sub-Saharan Africa is possible, study suggests

Article Type
Changed
Sun, 12/04/2016 - 06:00
Display Headline
Malaria elimination in sub-Saharan Africa is possible, study suggests

A small community in the Lake Kariba area of southern Zambia, where malaria elimination programs are underway. Photo from Milen Nikolov

Malaria elimination in historically high transmission areas like southern Africa is possible with tools that are already available, according to new research.

The study suggests that high levels of vector control are key, and mass drug campaigns cannot make much of an impact without proper vector control.

Milen Nikolov, of the Institute for Disease Modeling in Bellevue, Washington, and colleagues reported these findings in PLOS Computational Biology.

The researchers said the Lake Kariba region of Southern Province, Zambia, is part of a multi-country malaria elimination effort. However, elimination in this area is challenging because villages with high and low malaria burden are interconnected through human travel.

With this in mind, the researchers combined a mathematical model of malaria transmission with field data from Zambia to test a variety of strategies for eliminating malaria in the Lake Kariba region.

The team used detailed spatial surveillance data from field studies—including household locations, climate, clinical malaria incidence, prevalence of malaria infections, and bednet usage rates—to construct a model of interconnected villages, then tested a variety of intervention scenarios to see which ones could lead to elimination.

The results indicate that elimination requires high, yet realistic, levels of vector control. And mass drug campaigns deployed to kill parasites in the human population can boost the chances of achieving elimination as long as vector control is well-implemented.

The researchers said this work suggests that elimination programs in sub-Saharan Africa should focus on how to achieve and maintain excellent coverage of vector control measures rather than spending resources on mass drug campaigns that are predicted to have little effect without well-implemented vector control already in place.

Human movement within the region should be targeted to achieve elimination, as should the importation of infections from outside the region: both affect the likelihood of achieving elimination, and understanding regional movement patterns can help guide strategies for targeting specific groups of at-risk people.

While no sub-Saharan African country has yet eliminated malaria, the researchers predict that regional malaria elimination is within reach with current tools, provided the efficacy and operational efficiency attained in southern Zambia can be extended and targeted to other key areas.
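The intuition behind the coverage finding can be sketched with a toy Ross-Macdonald-style calculation. This is emphatically not the authors' model (their simulation is spatial and data-driven); it is only a hypothetical illustration, with an assumed baseline reproduction number, of why sustained vector control, which scales ongoing transmission, matters more than a one-off drug campaign, which clears current infections but leaves transmission potential unchanged:

```python
# Toy illustration (NOT the study's model): vector control multiplicatively
# reduces the effective reproduction number, so elimination (R_eff < 1)
# requires high coverage; a one-off mass drug campaign does not change R_eff.

def r_effective(r0, coverage, efficacy=0.9):
    # Fraction (coverage * efficacy) of transmission events is blocked.
    return r0 * (1 - coverage * efficacy)

R0 = 4.0  # hypothetical baseline reproduction number
for cov in (0.0, 0.5, 0.8, 0.95):
    print(f"coverage {cov:.0%}: R_eff = {r_effective(R0, cov):.2f}")
```

Under these assumed numbers, only the highest coverage level pushes R_eff below 1, mirroring the article's conclusion that high, well-maintained vector-control coverage is the prerequisite for elimination.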


VIDEO: Anti-CD22 CAR for R/R ALL impresses in early trial

Article Type
Changed
Fri, 01/04/2019 - 09:57

– In a first-in-humans trial, chimeric antigen receptor (CAR) T-cell therapy directed against CD22 was shown to be safe and was associated with minimal residual disease (MRD)–negative complete remissions in 8 of 10 children and young adults with relapsed/refractory B-precursor acute lymphoblastic leukemia treated at the highest dose levels. One patient remains in remission more than 1 year after treatment, one had a 6-month remission, and one had a remission lasting 3 months.

In a video interview, co-principal investigator Terry J. Fry, MD, of the Center for Cancer Research at the National Cancer Institute in Bethesda, Md., discusses the rationale behind using an alternative antigen target in salvage therapy for ALL, and the potential for combining antigen targets to treat patients with relapsed/refractory ALL.

The video associated with this article is no longer available on this site. Please view all of our videos on the MDedge YouTube channel.
Article Source

AT ASH 2016