Novel RA strategy: Target first-line biologics for likely methotrexate nonresponders
MAUI, HAWAII – An ongoing proof-of-concept study in the United Kingdom uses T-cell subset analysis to identify those patients with early rheumatoid arthritis who are unlikely to experience remission with methotrexate alone and therefore warrant more potent first-line therapy with a tumor necrosis factor (TNF) inhibitor in combination with methotrexate, Paul Emery, MD, said at the 2019 Rheumatology Winter Clinical Symposium.
“When we get the readout from this study, if we get the results we expect, we could actually put it to the National Health Service that it would be cost saving to use biologics as first-line therapy in a proportion, which will be about 40% of patients. But we won’t necessarily need to use them for a year, and we’ll get a 70%-plus remission rate, which I think is the sort of level we should be asking for in our patients,” according to Dr. Emery, professor of rheumatology and director of the University of Leeds (England) Musculoskeletal Biomedical Research Center.
This proof-of-concept study capitalizes on earlier work by Dr. Emery and coinvestigators, who showed in 70 patients with early rheumatoid arthritis given methotrexate as their first-ever disease-modifying antirheumatic drug (DMARD) that those with a normal pretreatment frequency of naive CD4+ T cells had an 83% remission rate at 6 months, as defined by a Disease Activity Score in 28 joints (DAS28) of less than 2.6. In contrast, only 21% of those whose naive CD4+ T-cell frequency was reduced relative to healthy controls achieved remission. In an analysis adjusted for age and the presence of anti–citrullinated protein antibodies (ACPA), a normal baseline naive CD4+ T-cell frequency was associated with a 5.9-fold increased likelihood of remission on methotrexate (Ann Rheum Dis. 2014 Nov;73[11]:2047-53).
In the new proof-of-concept study, DMARD-naive, ACPA-positive patients with early RA undergo T-cell subset quantification by flow cytometry. Thirty patients with a normal test result are assigned to methotrexate with treat-to-target dose escalation. Based upon the investigators’ earlier work, it’s anticipated that about 80% of these patients will be in remission at 6 months.
Sixty patients with an abnormal baseline T-cell test result are being randomized to methotrexate plus either placebo or biosimilar etanercept. Again based upon the earlier study, the expected remission rate at 6 months in the methotrexate-plus-placebo group is about 20%. In contrast, the anticipated remission rate in the patients on an anti-TNF biologic plus methotrexate as first-line therapy is about 70% on the basis of the results of previous clinical trials, including PRIZE (N Engl J Med. 2014 Nov 6;371[19]:1781-92) and COMET (Ann Rheum Dis. 2012 Jun;71[6]:989-92).
Meanwhile, the price tag for biosimilar TNF inhibitors in the United Kingdom has come down to the point that routine across-the-board use of biologics as first-line therapy is arguably cost effective, a situation Dr. Emery described as hitherto “the unthinkable.”
The cost of biosimilar adalimumab in the coming year will be less than $3,000 annually in the U.K. health care system. So if 100 patients with early RA and no contraindication to biologic therapy are placed on biosimilar adalimumab and methotrexate for 1 year, the total cost for the biologic in this cohort will be less than $300,000, and roughly 70 of the 100 patients will have achieved remission. This approach makes much more sense than current standard practice, which is to reserve biologics as second-line therapy for patients who have failed to achieve remission on nonbiologic DMARDs, thereby allowing their joint damage to advance in the interim to the point that they need to stay on biologic therapy for decades, the rheumatologist argued.
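The arithmetic behind that argument is simple enough to sketch. The figures below are the estimates quoted above (an under-$3,000 annual drug cost and a roughly 70% remission rate), not trial results:

```python
# Back-of-envelope cost sketch for first-line biosimilar therapy,
# using the figures quoted in the article (illustrative estimates, not trial data).

cohort_size = 100            # early RA patients, no contraindication to biologics
annual_drug_cost = 3_000     # upper bound, USD, biosimilar adalimumab in the U.K.
remission_rate = 0.70        # expected remission on anti-TNF plus methotrexate

total_cost = cohort_size * annual_drug_cost
remissions = round(cohort_size * remission_rate)
cost_per_remission = total_cost / remissions

print(f"Total biologic cost for the cohort: ${total_cost:,}")             # $300,000
print(f"Expected remissions at 1 year:      {remissions}")                # 70
print(f"Biologic cost per remission:        ${cost_per_remission:,.0f}")  # ~$4,286
```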
“Once one accepts that remission has become the aim of therapy, then the facts, I think, speak for themselves: There’s absolutely no doubt that the rate of remission is best with the first DMARD. So if our aim is remission, we should use that one opportunity with the best agent first, because we’re not going to get the same response later,” Dr. Emery said.
Also, “there’s no doubt” that the dose of biologics can be halved with no loss of efficacy in patients who achieve remission on full-dose therapy, as previously demonstrated in PRIZE and other trials. This strategy further reduces the overall cost of biologic therapy, he added.
Dr. Emery, who recently received the Order of the British Empire from Queen Elizabeth personally in recognition of his career achievements in rheumatology, reported having no financial conflicts of interest regarding his presentation.
REPORTING FROM RWCS 2019
Fluorouracil beats other actinic keratosis treatments in head-to-head trial
A head-to-head comparison of four commonly used field-directed treatments for actinic keratosis (AK) found that 5% fluorouracil cream was the most effective at reducing the size of lesions.
In a study published in the March 7 issue of the New England Journal of Medicine, researchers reported the outcomes of a multicenter, single-blind trial in 602 patients with five or more AK lesions in one continuous area on the head measuring 25-100 cm2. Patients were randomized to treatment with either 5% fluorouracil cream, 5% imiquimod cream, methyl aminolevulinate photodynamic therapy (MAL-PDT), or 0.015% ingenol mebutate gel.
Overall, 74.7% of patients who received fluorouracil cream achieved treatment success – defined as at least a 75% reduction in lesion size at 12 months after the end of treatment – compared with 53.9% of patients treated with imiquimod, 37.7% of those treated with MAL-PDT, and 28.9% of those treated with ingenol mebutate. The differences between fluorouracil and each of the other treatments were significant.
Maud H.E. Jansen, MD, and Janneke P.H.M. Kessels, MD, of the department of dermatology at Maastricht (the Netherlands) University Medical Center and their coauthors pointed out that, while there was plenty of literature about different AK treatments, there were few head-to-head comparisons and many studies were underpowered or had different outcome measures.
Even when the analysis was restricted to patients with grade I or II lesions, fluorouracil remained the most effective treatment, with 75.3% of patients achieving treatment success, compared with 52.6% with imiquimod, 38.7% with MAL-PDT, and 30.2% with ingenol mebutate.
There were not enough patients with more severe grade III lesions to enable a separate analysis of their outcomes; 49 patients (7.9%) in the study had at least one grade III lesion.
The authors noted that many previous studies had excluded patients with grade III lesions. “Exclusion of patients with grade III lesions was associated with slightly higher rates of success in the fluorouracil, MAL-PDT, and ingenol mebutate groups than the rates in the unrestricted analysis,” they wrote. The inclusion of patients with grade III AK lesions in this trial made it “more representative of patients seen in daily practice,” they added.
Treatment failure – less than 75% clearance of actinic keratosis at 3 months after the final treatment – was seen after one treatment cycle in 14.8% of patients treated with fluorouracil, 37.2% of patients on imiquimod, 34.6% of patients given photodynamic therapy, and 47.8% of patients on ingenol mebutate therapy.
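As a concrete restatement of the trial's two endpoints, here is a minimal sketch; the two 75% cutoffs come from the article, while the helper functions themselves are hypothetical, not from the study protocol:

```python
# Hypothetical helpers restating the trial's endpoint definitions;
# the 75% cutoffs are from the article, everything else is illustrative.

def treatment_success(reduction_at_12_months: float) -> bool:
    """Success: >= 75% reduction in lesion area 12 months after treatment ends."""
    return reduction_at_12_months >= 0.75

def treatment_failure(clearance_at_3_months: float) -> bool:
    """Failure: < 75% clearance 3 months after the final treatment."""
    return clearance_at_3_months < 0.75

print(treatment_success(0.80))  # True
print(treatment_failure(0.60))  # True -> patient offered a second cycle
```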
All of these patients were offered a second treatment cycle, but those treated with imiquimod, MAL-PDT, and ingenol mebutate were less likely to undergo one. The authors suggested that the greater willingness of patients in the fluorouracil group to undergo a second round of therapy may reflect less discomfort and inconvenience with the initial course, compared with the other regimens.
Full adherence to treatment was more common in the ingenol mebutate (98.7% of patients) and MAL-PDT (96.8%) groups, compared with the fluorouracil (88.7%) and imiquimod (88.2%) groups. However, patients in the fluorouracil group reported greater levels of patient satisfaction and improvements in health-related quality of life than did patients in the other treatment arms of the study.
No serious adverse events were reported with any of the treatments, and no patients stopped treatment because of adverse events. However, reports of moderate or severe crusts were highest among patients treated with imiquimod, and moderate to severe vesicles or bullae were highest among those treated with ingenol mebutate. Severe pain and severe burning sensation were significantly more common among those treated with MAL-PDT.
While the study had some limitations, the results “could affect treatment choices in both dermatology and primary care,” the authors wrote, pointing out how common AKs are in practice, accounting for 5 million dermatology visits in the United States every year. When considering treatment costs, “fluorouracil is also the most attractive option,” they added. “It is expected that a substantial cost reduction could be achieved with more uniformity in care and the choice for effective therapy.”
The study was supported by the Netherlands Organization for Health Research and Development. Five of the 11 authors declared conference costs, advisory board fees, or trial supplies from private industry, including from manufacturers of some of the products in the study. The remaining authors had no disclosures.
SOURCE: Jansen M et al. N Engl J Med. 2019;380:935-46.
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Forget what you learned about infective endocarditis
This week in MDedge Cardiocast: Infective endocarditis isn’t what it used to be, there’s a new, lower goal for Americans’ dietary intake of sodium, a drug to treat multiple myeloma also raises heart failure risk, and Big Pharma says it can’t drop drug list prices alone.
FDA approves liquid colchicine for gout
The Food and Drug Administration has approved Gloperba, a liquid oral formulation of colchicine, for prophylaxis of gout flares in adults, according to a statement from Romeg Therapeutics.
Colchicine has been used in capsule and tablet formulations to treat this form of arthritis for decades. An advantage of the new liquid formulation is that it allows physicians to “easily make dose adjustments,” according to the statement.
“Existing therapies do not adequately address the physician’s need to adjust dosages of colchicine to manage the toxicity profile for patients with renal and liver impairments, side effects, common drug-to-drug interactions, and age-related health disorders,” said Naomi Vishnupad, PhD, chief scientific officer of Romeg Therapeutics, in the statement.
According to the prescribing information for the drug on the FDA website, this formulation is indicated for prophylaxis rather than acute treatment of gout flares because the safety profile of acute treatment with it has not yet been studied. It is contraindicated in patients with hepatic and/or renal impairment. Gastrointestinal symptoms were the most commonly reported adverse reactions.
The drug is expected to be available this summer.
One-time, universal hepatitis C testing cost effective, researchers say
Universal one-time screening for hepatitis C virus infection is cost effective, compared with birth cohort screening alone, according to the results of a study published in Clinical Gastroenterology and Hepatology.
The Centers for Disease Control and Prevention and the U.S. Preventive Services Task Force recommend testing all individuals born between 1945 and 1965 in addition to injection drug users and other high-risk individuals. But so-called birth cohort screening does not reflect the recent spike in hepatitis C virus (HCV) cases among younger persons in the United States, nor the current recommendation to treat nearly all chronic HCV cases, wrote Mark H. Eckman, MD, of the University of Cincinnati, and his associates.
Using a computer program called Decision Maker, they modeled the cost-effectiveness of universal one-time testing, birth cohort screening, and no screening, based on quality-adjusted life-years (QALYs) and 2017 U.S. dollars. They assumed that all HCV-infected patients were treatment naive, treatment eligible, and asymptomatic (for example, had no decompensated cirrhosis). They used efficacy data from the ASTRAL trials of sofosbuvir-velpatasvir as well as the ENDURANCE, SURVEYOR, and EXPEDITION trials of glecaprevir-pibrentasvir. In the model, patients who did not achieve a sustained viral response to treatment went on to complete a 12-week triple direct-acting antiviral (DAA) regimen (sofosbuvir, velpatasvir, and voxilaprevir).
Based on these assumptions, universal one-time screening and treatment of infected individuals cost less than $50,000 per QALY gained, making it highly cost effective, compared with no screening, the investigators wrote. Universal screening also was highly cost effective when compared with birth cohort screening, costing $11,378 for each QALY gained.
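For readers unfamiliar with the metric, comparisons like these hinge on the incremental cost-effectiveness ratio (ICER): the extra cost of one strategy over another, divided by the extra QALYs it buys, judged against a willingness-to-pay threshold (here, $50,000 per QALY). A minimal sketch follows; the per-strategy cost and QALY inputs are hypothetical placeholders chosen only to reproduce the $11,378-per-QALY figure reported above:

```python
# Incremental cost-effectiveness ratio (ICER) sketch.
# The per-strategy costs and QALYs below are hypothetical placeholders;
# only the resulting ICER and the $50,000/QALY threshold come from the article.

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Extra dollars spent per extra QALY gained by the new strategy."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

WTP_THRESHOLD = 50_000  # willingness to pay, USD per QALY

# Hypothetical inputs: universal screening costs $113.78 more per person
# and yields 0.01 more QALYs than birth cohort screening.
ratio = icer(cost_new=1_113.78, qaly_new=10.01, cost_old=1_000.00, qaly_old=10.00)

print(f"ICER: ${ratio:,.0f} per QALY gained")  # $11,378
print("cost effective" if ratio < WTP_THRESHOLD else "not cost effective")
```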
“Analyses performed during the era of first-generation DAAs and interferon-based treatment regimens found birth-cohort screening to be ‘cost effective,’ ” the researchers wrote. “However, the availability of a new generation of highly effective, non–interferon-based oral regimens, with fewer side effects and shorter treatment courses, has altered the dynamic around the question of screening.” They pointed to another recent study in which universal one-time HCV testing was more cost effective than birth cohort screening.
Such findings have spurred experts to revisit guidelines on HCV screening, but universal testing is controversial when some states, counties, and communities have a low HCV prevalence. In the model, universal one-time HCV screening was cost effective (less than $50,000 per QALY gained), compared with birth cohort screening as long as prevalence exceeded 0.07% among adults not born between 1945 and 1965. The current prevalence estimate in this group is 0.29%, which is probably low because it does not account for the rising incidence among younger adults, the researchers wrote. In an ideal world, all clinics and hospitals would implement an HCV testing program, but in the real world of scarce resources, “data regarding the cost-effectiveness threshold can guide local policy decisions by directing testing services to settings in which they generate sufficient benefit for the cost.”
Partial funding came from the National Foundation for the Centers for Disease Control and Prevention (CDC Foundation), with funding provided through multiple donors to the CDC Foundation’s Viral Hepatitis Action Coalition. Dr. Eckman reported grant support from Merck and one coinvestigator reported ties to AbbVie, Gilead, Merck, and several other pharmaceutical companies.
SOURCE: Eckman MH et al. Clin Gastroenterol Hepatol. 2018 Sep 7. doi: 10.1016/j.cgh.2018.08.080.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Poor COPD management might increase MI risk in HIV
SEATTLE – Chronic obstructive pulmonary disease is independently associated with an increased risk of myocardial infarction in people with HIV, according to a report at the Conference on Retroviruses and Opportunistic Infections.
Chronic obstructive pulmonary disease (COPD) is known to increase the risk of myocardial infarction (MI) in the general population, but hadn’t been shown until now to do the same in HIV. The study raises the question of whether COPD is being managed adequately in patients with the virus, according to study lead Kristina Crothers, MD, associate professor in the division of pulmonary, critical care & sleep medicine at the University of Washington, Seattle.
The investigators reviewed 25,509 HIV patients in the Center for AIDS Research Network of Integrated Clinical Systems cohort, a large electronic database of HIV-infected people. They defined COPD by diagnostic codes and inhaler prescriptions. MIs were adjudicated by review.
The team identified 423 subjects with moderate to severe COPD, and 698 who had MIs, including 339 type 1 MIs (T1MI) from a ruptured plaque (54%) and 294 type 2 MIs (T2MI) from a supply-demand mismatch due to sepsis or some other problem (46%). In general, T2MIs are far more common in people with HIV.
COPD was associated with a greater than twofold increased risk of MI after adjustment for age, sex, viral load, nadir CD4 count, hypertension, and other confounders. The risk dropped slightly when smoking – both current smoking and pack-years – was added to the model (adjusted hazard ratio, 1.88; 95% confidence interval, 1.34-2.63).
The association was particularly strong for T2MI, especially in the setting of bacteremia and sepsis, and unlike T1MI, it remained significant after adjustment for smoking.
The study establishes a link between COPD and MI in HIV, but it could not answer what’s going on. Chronic inflammation from the virus could be at play, but the team also found hints of inadequate COPD management.
“About 60% of patients were on inhalers ... but only about 25% of them were on long-acting inhalers. 75% were only on short-acting.” That’s a problem because long-acting inhalers are needed to control exacerbations, Dr. Crothers said.
The study didn’t capture exacerbation rates, but increased rates could help explain the MI risk. Increased rates of pneumonia could as well, since pneumonia is a common cause of sepsis.
“We need to better manage complications of COPD in this population. I think optimizing long-term COPD management could have many beneficial effects,” Dr. Crothers said.
The National Institutes of Health funded the work. Dr. Crothers had no disclosures.
SOURCE: Crothers K et al. CROI 2019, Abstract 31.
REPORTING FROM CROI 2019
Higher dose of checkpoint inhibitor every 4 weeks feasible in NSCLC
SAN FRANCISCO – For patients with advanced non–small cell lung cancer (NSCLC) who previously had disease control with the checkpoint inhibitor nivolumab (Opdivo), second-line nivolumab at a higher dose every 4 weeks appeared to be comparable in efficacy and safety with standard-dose nivolumab every 2 weeks.
The key word in that last sentence is “appeared,” because the CheckMate 384 trial that was designed to show noninferiority of the every-4-weeks regimen lacked the statistical muscle to get the job done, reported Edward B. Garon, MD, from the University of California, Los Angeles.
“In many respects, extending the dosing frequency of nivolumab fulfills some of the promise of immunotherapy: The idea that we would be able to decrease the medicalization of the lives of our patients. For some people this would lead to them being able to resume a more normal work schedule, and for other people it would allow them to do things for fun, like travel on trips that would take longer than a couple of weeks,” he said at the American Society of Clinical Oncology (ASCO) – Society for Immunotherapy of Cancer (SITC): Clinical Immuno-Oncology Symposium.
However, because of difficulties in recruitment, the investigators had to stop enrollment early and settle for a sample size of 363 patients instead of the planned 600 that would have been needed to establish noninferiority with a 10% margin and a one-sided 95% confidence interval. Thus, the trial analysis can only be reported as descriptive rather than definitive, Dr. Garon acknowledged.
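For a sense of where a target of roughly 600 comes from, a standard two-proportion noninferiority calculation can be sketched. The 10% margin and one-sided 95% confidence level are from the trial; the assumed 72% event-free rate is the 6-month PFS reported below, while the power and dropout figures are guesses, since the protocol's actual planning assumptions are not reported here:

```python
# Rough noninferiority sample-size sketch for two proportions.
# Assumed: both arms share the 72% 6-month PFS quoted in the article,
# a 10% noninferiority margin, one-sided alpha = 0.05 (95% CI), and
# 80% power -- the power and dropout values are guesses, since the
# trial's actual planning parameters are not reported in the article.
from scipy.stats import norm

def n_per_arm(p, margin, alpha=0.05, power=0.80):
    """Patients per arm to show noninferiority when both arms truly equal p."""
    z_a, z_b = norm.ppf(1 - alpha), norm.ppf(power)
    return (z_a + z_b) ** 2 * 2 * p * (1 - p) / margin ** 2

n = n_per_arm(p=0.72, margin=0.10)
print(f"~{2 * n:.0f} patients before dropout")             # ~499
print(f"~{2 * n / 0.85:.0f} after allowing ~15% dropout")  # ~587, near the planned 600
```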
Nivolumab is approved at a fixed dose of 240 mg every 2 weeks for the treatment of multiple tumor types in several different nations, and in the United States and Canada it is approved at a dose of 480 mg every 4 weeks for the treatment of NSCLC.
The CheckMate 384 study enrolled patients with advanced or metastatic NSCLC who had received 3 mg/kg or 240 mg of nivolumab every 2 weeks for up to 1 year. The patients had to have had relatively good performance status (Eastern Cooperative Oncology Group 0-2) and two consecutive assessments of either complete response, partial response, or stable disease.
The patients were stratified by tumor histology (squamous or nonsquamous) and response to prior nivolumab therapy at randomization, and were then randomized to receive nivolumab 240 mg every 2 weeks or 480 mg every 4 weeks until disease progression or unacceptable toxicity for up to 2 years.
Dr. Garon presented an interim analysis including data on 329 of the 363 patients; the final analysis will occur after all patients have had a minimum of 12 months of follow-up. Here, he reported on 6-month progression-free survival (PFS), a coprimary endpoint along with 12-month PFS.
After a median follow-up of 9.5 months in the Q4-week group and 10.2 months in the Q2-week group, the 6-month PFS rates were identical between the two dosing strategies, at 72%. The median PFS was 12.1 months and 12.2 months, respectively.
“Although the study is no longer formally powered to show noninferiority, there’s certainly nothing in these curves that makes me concerned that this 480 mg every-4-week dose would be inferior,” Dr. Garon said.
There was a somewhat higher rate of treatment-related adverse events of any grade in the lower-dose, more frequent arm: 61% in the Q2-week arm versus 48% in the Q4-week arm. The respective rates of grade 3 or 4 adverse events were 12% and 8%. Rates of serious adverse events and of events leading to treatment discontinuation were similar between the groups; there were no treatment-related deaths.
The investigators hypothesize that the higher rate of overall events in the lower-dose group may be attributable to more frequent visits and more opportunities to report adverse events, Dr. Garon said.
“Overall, the clinical data are in agreement with the pharmacokinetic modeling and give further evidence for this 480 mg every 4 week nivolumab dosing option,” he concluded.
The study was supported by Bristol-Myers Squibb. Dr. Garon reported receiving research support from Bristol-Myers Squibb and others and consulting fees from Dracen Pharmaceuticals.
SOURCE: Garon EB et al. ASCO-SITC, Abstract 100.
REPORTING FROM ASCO-SITC
High-calorie diet may worsen Wilson disease
A high-calorie diet may cause earlier onset of more severe Wilson disease, according to a rodent study.
If translatable to humans, the results could explain “striking phenotype-genotype discrepancies” among patients with Wilson disease, and may give reason to monitor nutrition more closely, particularly dietary levels of fat and sugar, reported lead author Claudia Einer, a PhD candidate at the German Research Center for Environmental Health in Neuherberg, Germany, and her colleagues. Their findings clarify an association between impaired copper metabolism, which defines Wilson disease, and liver steatosis, a common finding in affected patients.
“Indeed, Wilson disease often may be misdiagnosed as nonalcoholic fatty liver disease (NAFLD),” the investigators wrote in Cellular and Molecular Gastroenterology and Hepatology. They noted that previous reports showed similar mitochondrial alterations in the livers of patients with NAFLD and those with Wilson disease. Furthermore, in a case report of a twin pair with Wilson disease, the twin with bulimia nervosa developed severe liver disease, whereas the other twin, who was undernourished, had mild liver disease. Considering these observations and other supportive evidence, the investigators tested this apparent relationship between a high-fat diet and liver damage in Wilson disease.
“The rationale of this study was that both enriched copper and fatty acids cause bioenergetic defects and therefore synergistically and detrimentally may coincide on hepatic mitochondria, which was found to be dramatically the case,” the investigators wrote.
The study involved homozygous Atp7b–/– rats, which mirror Wilson disease, and heterozygous Atp7b+/– rats, which served as control subjects because they lack copper accumulation. The high-calorie diet contained high fat and sugar levels to mirror “the eating habits in Western society, causing the ‘American-lifestyle-induced-obesity syndrome.’ ”
Within several weeks of starting the high-calorie diet, both control and Wilson disease rats showed higher liver triglyceride levels and visceral fat mass compared with rats on the normal diet, and liver histology showed macrosteatosis and an increased NAFLD Activity Score (NAS). Control rats maintained similar body and liver weights regardless of diet; in contrast, Wilson disease rats on the high-calorie diet showed increased liver weight compared with Wilson disease rats on the normal diet. In addition, Wilson disease rats fed the high-calorie diet had clinical liver injury, supported by elevated aspartate aminotransferase (AST) levels and gross hepatic damage. Histology revealed widespread necrosis, apoptosis, inflammation, and fibrosis; the findings were sufficient to constitute nonalcoholic steatohepatitis in all Wilson disease rats fed the high-calorie diet, compared with just one-third of the control rats receiving high calories. Additional testing showed that Wilson disease rats fed the high-calorie diet had disease onset 20 days sooner than did Wilson disease rats fed the normal diet.
“This is a remarkable disease acceleration,” the investigators noted, highlighting the median survival of 106 days in Wilson disease rats fed a normal diet.
Copper testing showed that Wilson disease rats fed the high-calorie diet had high serum levels of non–ceruloplasmin-bound copper, a sign of overt liver damage; based on histologic findings, the copper likely came from destroyed hepatocytes. Regardless of diet type, Wilson disease rats developed high levels of copper within the liver, suggesting comparable copper consumption via water sources. Regardless of genotype, the high-calorie diet led to higher mitochondrial copper levels than did the normal diet, but Wilson disease rats sequestered copper in their mitochondria to an extreme degree, showing the highest levels of all groups.
“Importantly,” the investigators wrote, “such increased mitochondrial copper significantly correlated with a higher NAS and a progressive Histologic Activity Index score.”
Closer inspection showed that the mitochondria of Wilson disease rats were abnormal regardless of diet, but those fed the high-calorie diet had “a most severe mitochondrial phenotype,” including detached membranes and ballooned cristae.
“These structural impairments were paralleled by remarkable mitochondrial functional deficits,” the investigators reported, referring to a significant decrease in adenosine triphosphate production and an increase in mitochondrial H2O2. In response to these mitochondrial abnormalities, cholesterol-related enzymes increased about fourfold, most prominently those involved in biliary excretion. The investigators summed up these hepatic events as a “toxic triad of adenosine triphosphate depletion, increased reactive oxygen species, and increased bile salts [that led] to an earlier onset of the disease and to enhanced disease progression.”
To complete the set of experiments, rats were given the copper chelator methanobactin. This treatment effectively mitigated structural and functional abnormalities in mitochondria and drove serum levels of AST, copper, and bile salts toward normal values. Although treatment halted overt liver damage, histology revealed that resolution was incomplete.
“We conclude that lipid accumulation in copper-burdened hepatocytes may represent a ‘second-hit’ in Wilson disease, inducing liver damage, and suggest that further research should establish whether dietary counseling of Wilson disease patients may be of therapeutic benefit,” the investigators concluded.
The study was funded by Deutsche Forschungsgemeinschaft and the WiFoMed Society. The investigators reported no conflicts of interest.
SOURCE: Einer et al. Cell Mol Gastroenterol Hepatol. 2019 Jan 11. doi: 10.1016/j.jcmgh.2018.12.005.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
Glyceryl trinitrate does not improve outcomes of ischemic stroke
HONOLULU – Glyceryl trinitrate (GTN) does not improve functional outcomes of ischemic stroke, according to data presented at the International Stroke Conference sponsored by the American Heart Association. Results suggest that GTN causes adverse effects in patients with intracerebral hemorrhage (ICH), but this observation is not definitive, according to the researchers. Study results were published online ahead of print Feb. 6 in the Lancet.
Nitric oxide is a regulatory molecule that has vasoactive effects and promotes blood pressure reduction. Vascular levels of nitric oxide are low in stroke, which suggests that the molecule may be a target for stroke treatment. GTN, a nitric oxide donor, lowered blood pressure and improved functional outcome among patients with acute stroke in the phase 2 Rapid Intervention with GTN in Hypertensive Stroke Trial (RIGHT).
Philip Bath, MD, Stroke Association Professor of Stroke Medicine at the University of Nottingham (England), and colleagues conducted the RIGHT-2 study to evaluate the safety and efficacy of GTN when administered early after onset of suspected stroke. Paramedics randomized patients in equal groups to a GTN patch or a sham patch in the ambulance. Three more patches were administered in the hospital on the following days. Active and sham patches looked similar and had no writing on them, thus ensuring effective blinding upon administration. Investigators followed up patients by telephone at 90 days to assess the modified Rankin Scale score and markers of disability, mood, cognition, and quality of life.
Eligible participants were adults who had dialed emergency services, independently or with assistance, because of a possible stroke. They had a Face, Arm, Speech, Time (FAST) score of 2 or 3, were within 4 hours of onset, and had a systolic blood pressure greater than 120 mm Hg. Patients from nursing homes, those with hypoglycemia, those who were unconscious, and those with a witnessed seizure were excluded.
Dr. Bath and colleagues planned to enroll 850 patients through five ambulance services and 30 hospitals across the United Kingdom. Data were to be examined through an intention-to-treat analysis. During the trial, however, the investigators observed that the rate of stroke mimics was 26%, rather than the 12% they had anticipated. To preserve the study’s power, the investigators increased the sample size to 1,149 patients. They also changed the planned data analysis from intention-to-treat to a hierarchical analysis: the primary analysis would be performed in patients with stroke or transient ischemic attack (TIA), and, if the results were positive, a standard intention-to-treat analysis would follow.
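As a back-of-the-envelope illustration of why the mimic rate forced a larger sample (this is not the trial's actual reestimation, which was not detailed in the presentation), one can compute the enrollment needed to preserve the expected number of true stroke/TIA patients:

```python
# Hypothetical illustration of sample-size dilution by stroke mimics.
planned_n = 850
assumed_mimic_rate = 0.12
observed_mimic_rate = 0.26

# True stroke/TIA patients the original design expected to capture
target_true_cases = planned_n * (1 - assumed_mimic_rate)    # 748

# Enrollment needed for the same number of true cases at the higher rate
adjusted_n = target_true_cases / (1 - observed_mimic_rate)  # ~1,011
print(round(adjusted_n))
```

This simple adjustment lands near 1,000 patients; the trial's final target of 1,149 presumably also reflected other design considerations, such as the revised hierarchical analysis.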
More than 99% of patients received the first patch, but only approximately 57% received the first two patches. One reason for this decrease in adherence was that many patients were discharged from the hospital with a TIA or a stroke mimic. Participants’ average age was 72 years. The median time from onset to randomization was 71 minutes, and the median time to treatment was 73 minutes. Participants’ mean systolic blood pressure was 162 mm Hg. Approximately 60% of the patients had a FAST score of 3. About 50% of participants had ischemic stroke, 13% had ICH, 10% had TIA, and 26% had stroke mimics.
At 1 hour after treatment initiation, systolic blood pressure was 6.2 mm Hg lower and diastolic blood pressure 2.7 mm Hg lower among patients who received GTN, compared with controls. At 1 day, the differences were 5.2 mm Hg and 2.5 mm Hg, respectively. Blood pressure became similar between the groups thereafter, “in part because of the tachyphylaxis that we know happens with GTN,” said Dr. Bath.
The researchers found no evidence of an effect of GTN on functional outcome at 90 days in participants with stroke or TIA. The adjusted common odds ratio of poor outcome was 1.25 in the GTN group, compared with the control group (95% confidence interval, 0.97-1.60; P = .083). “We were close to getting a negative trial,” said Dr. Bath.
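As a consistency check on these figures, the z statistic and P value implied by an odds ratio of 1.25 with a 95% CI of 0.97-1.60 can be recovered on the log scale; this is generic arithmetic, not part of the trial's analysis.

```python
from math import log, sqrt, erf

or_hat, ci_low, ci_high = 1.25, 0.97, 1.60

# Standard error on the log-odds-ratio scale, from the CI width
se = (log(ci_high) - log(ci_low)) / (2 * 1.96)
z = log(or_hat) / se

# Two-sided P value under a standard normal reference
p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
print(f"z = {z:.2f}, P = {p:.3f}")  # z ≈ 1.75, P ≈ .081
```

The small discrepancy from the reported P = .083 is consistent with rounding of the published odds ratio and confidence limits.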
Subgroup analyses revealed differences in outcome according to the time to randomization. GTN had a negative effect in patients treated within 1 hour of onset. Results were neutral, but tended to be negative, in patients treated between 1 and 2 hours of onset. Results were neutral, but tended to be positive, among patients treated at more than 2 hours after onset. There was no difference between groups in the rate of mortality.
One of the study’s limitations was its single-blind design. In addition, the trial was conducted in a single country, and the investigators changed the protocol after it was initiated. “We had a higher-than-expected [stroke] mimic rate, although I’m reassured by most experts that ... this is probably about right,” said Dr. Bath.
A potential reason for the neutral results is the negative effect that GTN had among patients with ICH, said Dr. Bath. “In that very early first hour, we are of course breaking a law that we learned in medical school, which is that the first part of hemostasis is spasm. We gave an antispasmodic: a vasodilator,” he added. “That is speculation.”
The trial was funded by the British Heart Foundation. Dr. Bath declared a modest ownership interest in Platelet Solutions and consultant or advisory board positions with Moleac, DiaMedica, Phagenesis, Nestle, and ReNeuron. The other investigators declared no conflicts of interest.
The RIGHT-2 trial shows the limitations of a prehospital enrollment model, wrote Karen C. Johnston, MD, professor of neurology at the University of Virginia in Charlottesville, and Valerie L. Durkalski-Mauldin, PhD, professor of medicine at Medical University of South Carolina in Charleston, in an editorial accompanying the RIGHT-2 trial results. The rate of nonstroke diagnoses was so high that it would have reduced the study’s power to assess the efficacy of glyceryl trinitrate (GTN), had the investigators not increased the sample size and changed the statistical analysis plan.
“Future prehospital trials need to consider the implications of enrolling, yet excluding, stroke mimics in the primary analysis,” said Dr. Johnston and Dr. Durkalski-Mauldin. Using telemedicine in the ambulance to facilitate direct contact between the stroke provider and the patient and emergency medical services provider could reduce the enrollment of patients with stroke mimics in clinical trials, they added. “Improved tools to exclude stroke mimics in the field have been difficult to develop and validate. The absence of imaging in most ambulances will continue to limit field personnel from definitively determining ischemic stroke from intracerebral hemorrhage, which will limit hyperacute trials to interventions presumed safe in both populations.”
In addition, the blood pressure reduction that GTN provided might not be clinically relevant, said Dr. Johnston and Dr. Durkalski-Mauldin. “The RIGHT-2 investigators report no difference in blood pressure at day 3 or day 4 of treatment, which might have been related to the very low adherence to study protocol by day 4.
“Regardless of these limitations, RIGHT-2 has provided high-level evidence that GTN given within 4 hours of onset does not significantly improve outcome in hyperacute patients presenting with possible stroke,” the authors concluded (Lancet. 2019 Feb 6. doi: 10.1016/S0140-6736(19)30276-4). Dr. Johnston and Dr. Durkalski-Mauldin declared no conflicts of interest.
REPORTING FROM ISC 2019
Epicutaneous milk immunotherapy can resolve pediatric eosinophilic esophagitis
SAN FRANCISCO – Clinicians safely used epicutaneous immunotherapy to resolve milk-induced eosinophilic esophagitis in children and teens in a placebo-controlled pilot study that included 20 patients.
Following the randomized phase of the study, all 19 patients who continued to participate began an 11-month open-label phase of epicutaneous immunotherapy to milk. At the end of this open-label phase, nine patients (47%) followed in this phase showed a substantial cut in their eosinophilic esophagitis (EoE) response to milk, with fewer than 15 eosinophils in a high-powered field, said Jonathan M. Spergel, MD, chief of the allergy section and Stuart E. Starr Chair of Pediatrics at the Children’s Hospital of Philadelphia, while presenting a poster at the annual meeting of the American Academy of Allergy, Asthma, and Immunology. An immunologic response of this sort would likely correlate with substantial clinical benefit.
“I’m happy with a 47% response,” Dr. Spergel said, adding that the responding patients “tolerate milk without symptoms, and there is really no risk” from this form of immunotherapy, which produced no serious adverse effects and caused 1 of 15 patients to stop treatment because of a treatment-related effect during the randomized phase. The most common adverse reaction was gastrointestinal symptoms, but these were just marginally more common among patients on active treatment than in control patients.
In contrast, oral immunotherapy with milk has been ineffective in children with an EoE milk reaction, and results from subcutaneous or sublingual immunotherapy for this form of milk allergy haven’t been reported, he said. The most common, current approaches to managing EoE from milk in children are either milk avoidance or treatment to reduce inflammation.
The epicutaneous approach “uses substantially lower dosing [micrograms vs. milligrams], avoids oral allergen ingestion, and may have a more advantageous adverse event profile and better adherence than other therapies,” according to a recent report that tested epicutaneous immunotherapy for peanut allergy in a phase 3 trial with 356 children (JAMA. 2019 Feb 22. doi: 10.1001/jama.2019.1113). The Viaskin Milk system tested in the current milk study involves placing a disc coated with 500 mcg of lyophilized milk protein on the skin for a gradually increasing number of hours daily until the disc is worn continuously, with daily disc changes. On the skin, the protein on the disc interacts with epidermal Langerhans cells to trigger desensitization.
The Milk Patch for Eosinophilic Esophagitis (SMILEE) study enrolled 20 patients aged 4-17 years with milk-induced EoE at Children’s Hospital of Philadelphia and randomized 15 to receive active epicutaneous immunotherapy to milk and 5 to receive placebo treatment. The protocol called for 9 months of epicutaneous immunotherapy without any milk exposure, followed by 2 months of continued treatment coupled with at least 240 mL of milk consumption daily. At the end of these 2 months, the researchers performed an esophageal biopsy on each patient to determine eosinophil density in the tissue. The study’s primary endpoint was the number of eosinophils in a high-powered field.
During the randomized phase, 8 of the 15 patients assigned to active treatment and 3 of 5 patients assigned to the placebo arm had violations of the treatment protocol, the diet protocol, or both. A per-protocol analysis restricted to the seven actively treated and two placebo patients who adhered to the protocol showed a mean eosinophil count of 26 cells in patients on active treatment versus 95 cells among the controls, a statistically significant difference. However, in the intention-to-treat analysis, which included all 20 enrolled patients, the primary endpoint showed no significant difference in eosinophil counts between the two study arms.
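To make the per-protocol versus intention-to-treat distinction concrete, here is a sketch using entirely hypothetical eosinophil counts (chosen only so that the per-protocol means match the reported 26 and 95 cells; no patient-level SMILEE data appear in this report):

```python
# Hypothetical peak eosinophil counts (cells per high-powered field).
# (arm, adhered-to-protocol flag, count) -- all values invented.
patients = [
    ("active", True, 10), ("active", True, 14), ("active", True, 30),
    ("active", True, 22), ("active", True, 40), ("active", True, 28),
    ("active", True, 38),
    ("active", False, 120), ("active", False, 95), ("active", False, 110),
    ("active", False, 80), ("active", False, 105), ("active", False, 90),
    ("active", False, 100), ("active", False, 85),
    ("placebo", True, 90), ("placebo", True, 100),
    ("placebo", False, 60), ("placebo", False, 75), ("placebo", False, 95),
]

def mean_count(arm, per_protocol_only):
    counts = [c for a, adhered, c in patients
              if a == arm and (adhered or not per_protocol_only)]
    return sum(counts) / len(counts)

for label, pp in [("Intention-to-treat", False), ("Per-protocol", True)]:
    print(f"{label}: active {mean_count('active', pp):.0f}, "
          f"placebo {mean_count('placebo', pp):.0f}")
```

With these invented counts the per-protocol gap (26 vs. 95) is wide, while the intention-to-treat means are much closer, mirroring how protocol violations can dilute an effect estimate.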
Although Dr. Spergel said that he was not aware of the developing company’s plans for further study of epicutaneous milk immunotherapy, from a scientific standpoint the next step should be a phase 2 or phase 2/3 trial for safety and efficacy. EoE was historically considered a rare disease, but a 2015 review of the condition called it “one of the most common conditions diagnosed during the assessment of feeding problems in children” (New Engl J Med. 2015 Oct 22;373[17]:1640-8).
The study was funded by DBV Technologies, which is developing the epicutaneous immunotherapy system. Dr. Spergel has been a consultant to and has received research funding from DBV Technologies.
SOURCE: Spergel JM et al. J Allergy Clin Immunol. 2019 Feb;143[2]:AB430.
SAN FRANCISCO – Clinicians safely used epicutaneous immunotherapy to resolve eosinophilic esophagitis in children and teens secondary to milk consumption in a placebo-controlled, pilot study that included 20 patients.
SAN FRANCISCO – Clinicians safely used epicutaneous immunotherapy to resolve eosinophilic esophagitis secondary to milk consumption in children and teens in a placebo-controlled pilot study that included 20 patients.
Following the randomized phase of the study, all 19 patients who continued to participate began an 11-month open-label phase of epicutaneous immunotherapy to milk. At the end of this open-label phase, 9 of the 19 patients (47%) showed a substantial reduction in their eosinophilic esophagitis (EoE) response to milk, with fewer than 15 eosinophils per high-power field, said Jonathan M. Spergel, MD, chief of the allergy section and Stuart E. Starr Chair of Pediatrics at the Children’s Hospital of Philadelphia, while presenting a poster at the annual meeting of the American Academy of Allergy, Asthma, and Immunology. An immunologic response of this sort would likely correlate with substantial clinical benefit.
“I’m happy with a 47% response,” Dr. Spergel said, adding that the responding patients “tolerate milk without symptoms, and there is really no risk” from this form of immunotherapy, which produced no serious adverse effects; 1 of the 15 actively treated patients stopped treatment because of a treatment-related effect during the randomized phase. The most common adverse reactions were gastrointestinal symptoms, which were only marginally more common among patients on active treatment than among controls.
In contrast, oral immunotherapy with milk has been ineffective in children with an EoE milk reaction, and results from subcutaneous or sublingual immunotherapy for this form of milk allergy have not been reported, he said. The most common current approaches to managing milk-induced EoE in children are milk avoidance and treatment to reduce inflammation.
The epicutaneous approach “uses substantially lower dosing [micrograms vs. milligrams], avoids oral allergen ingestion, and may have a more advantageous adverse event profile and better adherence than other therapies,” according to a recent report that tested epicutaneous immunotherapy for peanut allergy in a phase 3 trial with 356 children (JAMA. 2019 Feb 22. doi: 10.1001/jama.2019.1113). The Viaskin Milk system tested in the current milk study involves placing a disc coated with 500 mcg of lyophilized milk protein on the skin for a gradually increasing number of hours daily until the disc is worn continuously, with the disc changed daily. On the skin, the protein on the disc interacts with epidermal Langerhans cells to trigger desensitization.
The Milk Patch for Eosinophilic Esophagitis (SMILEE) study enrolled 20 patients aged 4-17 years with milk-induced EoE at Children’s Hospital of Philadelphia and randomized 15 to active epicutaneous immunotherapy to milk and 5 to placebo treatment. The protocol called for 9 months of epicutaneous immunotherapy without any milk exposure, followed by 2 months of continued treatment coupled with consumption of at least 240 mL of milk daily. At the end of those 2 months, the researchers performed an esophageal biopsy on each patient to determine eosinophil density in the tissue. The study’s primary endpoint was the number of eosinophils per high-power field.
During the randomized phase, 8 of the 15 patients assigned to active treatment and 3 of the 5 patients assigned to the placebo arm violated the treatment protocol, the diet protocol, or both. A per-protocol analysis restricted to the seven actively treated and two placebo patients who adhered to the protocol showed a mean eosinophil count of 26 cells per high-power field in patients on active treatment and 95 cells among the controls, a statistically significant difference. In the intention-to-treat analysis, however, which included all 20 enrolled patients, the primary endpoint showed no significant difference in eosinophil counts between the two study arms.
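The difference between the two analyses comes down to which patients are counted in each arm. The following minimal Python sketch illustrates the idea with hypothetical records; the individual-patient values are invented for illustration, and only the arm labels and the idea of an adherence split mirror the study:

from statistics import mean

# Each record: (assigned arm, adhered to protocol?, eosinophils per high-power field).
# These values are hypothetical; they are not the SMILEE patient data.
patients = [
    ("active", True, 20), ("active", True, 35), ("active", False, 90),
    ("placebo", True, 95), ("placebo", True, 100), ("placebo", False, 85),
]

def arm_means(records):
    # Mean eosinophil count per high-power field for each study arm.
    return {
        arm: mean(count for a, _, count in records if a == arm)
        for arm in ("active", "placebo")
    }

# Intention-to-treat: every randomized patient is analyzed in the assigned arm.
itt_means = arm_means(patients)

# Per-protocol: only patients who adhered to the treatment and diet protocols.
pp_means = arm_means([r for r in patients if r[1]])

print("ITT:", itt_means)  # nonadherent patients dilute the apparent treatment effect
print("PP: ", pp_means)   # comparison among adherent patients only

In a small pilot such as this one, dropping nonadherent patients shrinks the per-protocol groups considerably (here to seven and two patients), which is why a significant per-protocol result can coexist with a null intention-to-treat result.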
Although Dr. Spergel said he was not aware of the developing company’s plans for further study of epicutaneous milk immunotherapy, from a scientific standpoint the next step should be a phase 2 or phase 2/3 trial for safety and efficacy. EoE was historically considered a rare disease, but a 2015 review of the condition called it “one of the most common conditions diagnosed during the assessment of feeding problems in children” (N Engl J Med. 2015 Oct 22;373[17]:1640-8).
The study was funded by DBV Technologies, which is developing the epicutaneous immunotherapy system. Dr. Spergel has been a consultant to and has received research funding from DBV Technologies.
SOURCE: Spergel JM et al. J Allergy Clin Immunol. 2019 Feb;143[2]:AB430.
REPORTING FROM AAAAI 2019
Key clinical point: Epicutaneous immunotherapy safely resolved milk-induced eosinophilic esophagitis in nearly half of treated children.
Major finding: After 11 months of open-label treatment, 9 of 19 patients had resolution of their eosinophilic esophagitis reaction to milk.
Study details: A 2-year, single-center study of epicutaneous milk immunotherapy in 20 children with milk-induced eosinophilic esophagitis.
Disclosures: The study was funded by DBV Technologies, which is developing the epicutaneous immunotherapy system. Dr. Spergel has been a consultant to and has received research funding from DBV Technologies.
Source: Spergel JM et al. J Allergy Clin Immunol. 2019 Feb;143[2]:AB430.