VIDEO: No short-term link found between PPIs and myocardial infarction in a large retrospective insurance claims study.
SOURCE: AMERICAN GASTROENTEROLOGICAL ASSOCIATION
Over a median follow-up of 2-3 months, estimated weighted risks of first-ever MI were low and similar regardless of whether patients started PPIs or histamine2-receptor antagonists (H2RAs), reported Suzanne N. Landi of the University of North Carolina at Chapel Hill and her associates. “Contrary to prior literature, our analyses do not indicate increased risk of MI in PPI initiators compared to histamine2-receptor antagonist initiators,” they wrote in the March issue of Gastroenterology.
Epidemiologic studies have produced mixed findings on PPI use and MI risk. Animal models and ex vivo studies of human tissue indicate that PPIs might harm coronary vessels by increasing plasma levels of asymmetrical dimethylarginine, which counteracts the vasoprotective activity of endothelial nitric oxide synthase, the investigators noted. To further assess PPIs and risk of MI while minimizing potential confounding, they studied new users of either prescription PPIs or an active comparator, prescription H2RAs. The dataset included administrative claims for more than 5 million patients with no MI history who were enrolled in commercial insurance plans or Medicare Supplemental Insurance plans. The study data spanned from 2001 to 2014, and patients were followed from their initial acid-suppressant prescription until they either developed a first-ever MI, stopped their medication, or left their insurance plan. Median follow-up times were 60 days in patients with commercial insurance and 96 days in patients with Medicare Supplemental Insurance, which employers provide for individuals who are at least 65 years old.
After controlling for numerous measurable clinical and demographic confounders, the estimated 12-month risk of MI was about 2 cases per 1,000 commercially insured patients and about 8 cases per 1,000 Medicare Supplemental Insurance enrollees. The estimated 12-month risk of MI did not significantly differ between users of PPIs and H2RAs, regardless of whether they were enrolled in commercial insurance plans (weighted risk difference per 1,000 users, –0.08; 95% confidence interval, –0.51 to 0.36) or Medicare Supplemental Insurance plans (weighted risk difference per 1,000 users, –0.45; 95% CI, –1.53 to 0.58).
Each drug class also conferred a similar estimated risk of MI at 36 months, with weighted risk differences of 0.44 (95% CI, –0.90 to 1.63) per 1,000 commercial plan enrollees and –0.33 (95% CI, –4.40 to 3.46) per 1,000 Medicare Supplemental Insurance plan enrollees, the researchers reported. Weighted estimated risk ratios also were similar between drug classes, ranging from 0.87 (95% CI, 0.76 to 0.99) at 3 months among Medicare Supplemental Insurance enrollees to 1.08 (95% CI, 0.87 to 1.35) at 36 months among commercial insurance plan members.
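For readers less used to results expressed as a risk difference per 1,000 users, the minimal Python sketch below shows how such a figure and its confidence interval are formed and why an interval that spans zero is read as no detectable difference. The counts are hypothetical, not the study's data, and the calculation is unweighted, so it does not reproduce the propensity-score weighting the authors used.

```python
# Illustrative only: hypothetical counts, NOT data from Landi et al.
# Shows how an (unweighted) risk difference per 1,000 new users is formed
# and why a confidence interval that crosses zero reads as "no difference."
from math import sqrt

def risk_difference_per_1000(events_a, n_a, events_b, n_b, z=1.96):
    """Risk difference (A minus B) per 1,000 patients with a normal-approximation 95% CI."""
    risk_a = events_a / n_a
    risk_b = events_b / n_b
    rd = risk_a - risk_b
    se = sqrt(risk_a * (1 - risk_a) / n_a + risk_b * (1 - risk_b) / n_b)
    lo, hi = rd - z * se, rd + z * se
    return rd * 1000, lo * 1000, hi * 1000

# Hypothetical example: 200 MIs among 100,000 PPI initiators vs. 210 among 100,000 H2RA initiators
rd, lo, hi = risk_difference_per_1000(events_a=200, n_a=100_000, events_b=210, n_b=100_000)
print(f"RD per 1,000: {rd:.2f} (95% CI, {lo:.2f} to {hi:.2f})")
# The interval includes 0, so these hypothetical data are compatible with no difference in MI risk.
```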
“Previous studies have examined the risk of MI in PPI users and compared directly to nonusers, which may have resulted in stronger confounding by indication and other risk factors, such as BMI [body mass index] and baseline cardiovascular disease,” the investigators wrote. “Physicians and patients should not avoid starting a PPI because of concerns related to MI risk.”
The researchers received no grant support for this study. Ms. Landi disclosed a student fellowship from UCB Biosciences.
SOURCE: Landi SN et al. Gastroenterology. 2017 Nov 6. doi: 10.1053/j.gastro.2017.10.042.
In the late 2000s, several large epidemiologic studies suggested that proton pump inhibitors (PPIs) increase the risk for MI in users of clopidogrel. There was a proposed mechanism: PPIs competitively inhibit cytochrome P450 isoenzymes, which can block clopidogrel activation and, ex vivo, increase platelet aggregation. It sounded scary – but fortunately, some reassuring data quickly emerged. In 2007, the COGENT trial randomized patients with cardiovascular disease to a PPI/clopidogrel versus a placebo/clopidogrel combination pill. After 3 years of follow-up, there was no difference in rates of death or cardiovascular events. In the glaring light of these randomized controlled trial data, earlier studies didn’t look so convincing.
So why won’t the PPI/MI issue die? In part because COGENT was a relatively small study. It included 3,761 patients, but the main result depended on 109 cardiovascular events. Naysayers have argued that perhaps if COGENT had been a bigger study, the result would have been different.
In this context, the epidemiologic study by Suzanne Landi and her associates provides further reassurance that PPIs do not cause MI. Two insurance cohorts comprising over 5 million patients were used to compare PPI users with histamine2-receptor antagonist users after adjusting for baseline differences between the two groups. The large size of the dataset allowed the authors to make precise estimates; we can say with confidence that there was no clinically relevant PPI/MI risk in these data.
Can we forget about PPIs and MI? These days, my patients worry more about dementia or chronic kidney disease. But the PPI/MI story is worth remembering. Large epidemiologic studies are sometimes contradicted by subsequent studies and need to be evaluated in context.
Daniel E. Freedberg, MD, MS, is an assistant professor of medicine at the Columbia University Medical Center, New York. He has consulted for Pfizer.
FROM GASTROENTEROLOGY
Key clinical point: Starting a PPI did not appear to increase the short-term risk of MI.
Major finding: Over a median follow-up time of 2-3 months, the estimated risk of first-ever MI did not statistically differ between initiators of PPIs and initiators of histamine2-receptor antagonists.
Data source: Analyses of commercial and Medicare Supplemental Insurance claims for more than 5 million patients from 2001-2014.
Disclosures: The researchers received no grant support for this study. Ms. Landi disclosed a student fellowship from UCB Biosciences.
Source: Landi SN et al. Gastroenterology. 2017 Nov 6. doi: 10.1053/j.gastro.2017.10.042.
AGA Guideline: Use goal-directed fluid therapy, early oral feeding in acute pancreatitis
Patients with acute pancreatitis should receive “goal-directed” fluid therapy with normal saline or Ringer’s lactate solution rather than hydroxyethyl starch (HES) fluids, states a new guideline from the AGA Institute.
In a single-center randomized trial, hydroxyethyl starch fluids conferred a 3.9-fold increase in the odds of multiorgan failure (95% confidence interval for odds ratio, 1.2-12.0) compared with normal saline in patients with acute pancreatitis, wrote guideline authors Seth D. Crockett, MD, MPH, of the University of North Carolina, Chapel Hill, and his associates. This trial and another randomized study found no mortality benefit for HES compared with standard fluid resuscitation. The evidence is “very low quality” but mirrors the critical care literature, according to the experts. So far, Ringer’s lactate solution and normal saline have shown similar effects on the risk of organ failure, necrosis, and mortality, but ongoing trials should better clarify this choice, they noted (Gastroenterology. doi: 10.1053/j.gastro.2018.01.032).
The guideline addresses the initial 2-week period of treating acute pancreatitis. It defines goal-directed fluid therapy as titration based on meaningful targets, such as heart rate, mean arterial pressure, central venous pressure, urine output, blood urea nitrogen concentration, and hematocrit. Studies of goal-directed fluid therapy in acute pancreatitis have been unblinded, have used inconsistent outcome measures, and have found no definite benefits over nontargeted fluid therapy, note the guideline authors. Nevertheless, they conditionally recommend goal-directed fluid therapy, partly because a randomized, blinded trial of patients with severe sepsis or septic shock (which physiologically resembles acute pancreatitis) had in-hospital mortality rates of 31% when they received goal-directed fluid therapy and 47% when they received standard fluid therapy (P = .0009).
The guideline recommends against routine use of two interventions: prophylactic antibiotics and urgent endoscopic retrograde cholangiopancreatography (ERCP) for patients with acute pancreatitis. The authors note that no evidence supports routine prophylactic antibiotics for acute pancreatitis patients without cholangitis, and that urgent ERCP did not significantly affect the risk of mortality, multiorgan failure, single-organ failure, infected pancreatic and peripancreatic necrosis, or necrotizing pancreatitis in eight randomized controlled trials of patients with acute gallstone pancreatitis.
The guideline strongly recommends early oral feeding and enteral rather than parenteral nutrition for all patients with acute pancreatitis. In 11 randomized controlled trials, early and delayed feeding led to similar rates of mortality, but delayed feeding produced a 2.5-fold higher risk of necrosis (95% CI for OR, 1.4-4.4) and tended to increase the risk of infected peripancreatic necrosis, multiorgan failure, and total necrotizing pancreatitis, the authors wrote. In another 12 trials, enteral nutrition significantly reduced the risk of infected peripancreatic necrosis, single-organ failure, and multiorgan failure compared with parenteral nutrition.
Clinicians continue to debate cholecystectomy timing in patients with biliary or gallstone pancreatitis. The guideline strongly recommends same-admission cholecystectomy, citing a randomized controlled trial in which this approach markedly reduced the combined risk of mortality and gallstone-related complications (OR, 0.2; 95% CI, 0.1-0.6), readmission for recurrent pancreatitis (OR, 0.3; 95% CI, 0.1-0.9), and pancreaticobiliary complications (OR, 0.2; 95% CI, 0.1-0.6). “The AGA issued a strong recommendation due to the quality of available evidence and the high likelihood of benefit from early versus delayed cholecystectomy in this patient population,” the experts stated.
Patients with biliary pancreatitis should be evaluated for cholecystectomy during the same admission, while those with alcohol-induced pancreatitis should receive a brief alcohol intervention, according to the guideline. The authors also call for better studies of how alcohol and tobacco cessation measures affect the risk of recurrent acute pancreatitis, chronic pancreatitis, and pancreatic cancer, as well as quality of life, health care utilization, and mortality.
The authors also noted knowledge gaps concerning the relative benefits of risk stratification tools, the use of prophylactic antibiotics in patients with severe acute pancreatitis or necrotizing pancreatitis, and the timing of ERCP in patients with severe biliary pancreatitis with persistent biliary obstruction.
The guideline was developed solely with funding from the AGA Institute and received no external support. The authors disclosed no relevant conflicts of interest.
Source: Crockett SD et al. Gastroenterology. doi: 10.1053/j.gastro.2018.01.032.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Obesity affects the ability to diagnose liver fibrosis
Body mass index helps account for the 43.7% rate of discordant fibrosis findings between magnetic resonance elastography (MRE) and transient elastography (TE), according to a study from the University of California, San Diego.
“This study demonstrates that BMI is a significant factor of discordancy between MRE and TE for the stage of significant fibrosis (2-4 vs. 0-1),” wrote Cyrielle Caussy, MD, and her colleagues (Clin Gastroenterol Hepatol. 2018 Jan 15. doi: 10.1016/j.cgh.2017.10.037). “Furthermore, this study showed that the grade of obesity is also a significant predictor of discordancy between MRE and TE because the discordance rate between MRE and TE increases with the increase in BMI.”
Dr. Caussy of the University of California, San Diego, and her colleagues had noted that MRE and TE had discordant findings in obese patients. To ascertain under what conditions TE and MRE produce the same readings, Dr. Caussy and her associates conducted a cross-sectional study of two cohorts with nonalcoholic fatty liver disease (NAFLD) who underwent contemporaneous MRE, TE, and liver biopsy. TE utilized both M and XL probes during imaging. The training cohort involved 119 adult patients undergoing NAFLD testing from October 2011 through January 2017. The validation cohort, consisting of 75 adults with NAFLD undergoing liver imaging from March 2010 through May 2013, was formed to validate the findings of the training cohort.
The study revealed that BMI was a significant predictor of discordance between MRE and TE results for the staging of significant liver fibrosis (stage 2-4 vs. 0-1). After adjustment for age and sex, each 5-unit increase in BMI was associated with greater odds of discordance (odds ratio, 1.694; 95% confidence interval, 1.145-2.507; P = .008). This was not a static relationship, and as BMI increased, so did the discordance between MRE and TE (P = .0309). Interestingly, the discordance rate was significantly higher in participants with BMIs greater than 35 kg/m2, compared with participants with BMIs below 35 kg/m2 (63.0% vs. 38.0%; P = .022), the investigators reported.
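A note on reading the per-5-unit estimate: logistic models assume the log-odds change linearly with BMI, so a per-5-unit odds ratio can be rescaled to other BMI differences. The short Python sketch below does that arithmetic for illustration only; the 1.694 value is taken from the report, but the rescaled numbers are not results from the study.

```python
# Illustrative rescaling of a per-5-unit odds ratio for MRE/TE discordance.
# Assumes log-odds are linear in BMI (the standard logistic-regression assumption);
# the rescaled numbers are arithmetic illustrations, not results from Caussy et al.
import math

or_per_5_units = 1.694  # reported adjusted OR per 5-unit BMI increase

def rescale_or(or_per_ref, ref_units, target_units):
    """Rescale an odds ratio reported per `ref_units` to a `target_units` difference."""
    log_or_per_unit = math.log(or_per_ref) / ref_units
    return math.exp(log_or_per_unit * target_units)

for delta in (1, 5, 10):
    print(f"BMI difference of {delta} kg/m^2 -> OR about {rescale_or(or_per_5_units, 5, delta):.2f}")
# A 10-unit BMI difference corresponds to roughly 1.694**2, i.e., about 2.87, under this assumption.
```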
The study had both strengths and limitations. One strength was the use of two cohorts, particularly the validation cohort; another was the use of liver biopsy, the reference standard for assessing fibrosis. A limitation was that the study was conducted at specialized, tertiary care centers using advanced imaging techniques that may not be available at other clinics. Additionally, the cohorts included a small number of patients with advanced fibrosis.
“The integration of the BMI in the screening strategy for the noninvasive detection of liver fibrosis in NAFLD should be considered, and this parameter would help to determine when MRE is not needed in future guidelines,” wrote Dr. Caussy and her associates. “Further cost-effectiveness studies are necessary to evaluate the clinical utility of MRE, TE, and/or liver biopsy to develop optimal screening strategies for diagnosing NAFLD-associated fibrosis.”
Jun Chen, MD, Meng Yin, MD, and Richard L. Ehman, MD, all have intellectual property rights and financial interests in elastography technology. Dr. Ehman also serves as a noncompensated CEO of Resoundant. Claude B. Sirlin, MD, has served as a consultant to Bayer and GE Healthcare. The other authors disclosed no conflicts.
The AGA Obesity Practice Guide provides a comprehensive, multi-disciplinary process to personalize innovative obesity care for safe and effective weight management. Learn more at www.gastro.org/obesity.
SOURCE: Caussy C et al. Clin Gastroenterol Hepatol. 2018 Jan 15. doi: 10.1016/j.cgh.2017.10.037.
VIDEO: Cystic fibrosis patients need earlier, more frequent colorectal cancer screening
Adults with cystic fibrosis (CF) should undergo screening colonoscopy for colorectal cancer every 5 years beginning at age 40 years, unless they have had a solid organ transplant – in which case, screening should begin at age 30 years. For both groups, screening intervals should be shortened to 3 years if any adenomatous polyps are recovered.
The new screening recommendation is 1 of 10 set forth by the Cystic Fibrosis Foundation, in conjunction with the American Gastroenterological Association. The document reflects the significantly increased risk of colorectal cancer among adults with the chronic lung disorder, Denis Hadjiliadis, MD, and his colleagues wrote in the February issue of Gastroenterology; the risk approaches a 30-fold increase among CF patients who have undergone a lung transplant.
SOURCE: American Gastroenterological Association
In addition to making recommendations on screening intervals and protocols, the document asks clinicians to move beyond thinking of CF as a respiratory-only disease.
“Physicians should recognize that CF is a colon cancer syndrome,” wrote Dr. Hadjiliadis, director of the Adult Cystic Fibrosis Program at the University of Pennsylvania, Philadelphia, and his coauthors.
The increased colorectal cancer risk has become increasingly evident as CF patients live longer, Dr. Hadjiliadis and the panel wrote.
“The current median predicted survival is 41 years, and persons born in 2015 have an estimated average life expectancy of 45 years. The increasing longevity of adults with CF puts them at risk for other diseases, such as gastrointestinal cancer.”
In addition to the normal age-related risk, however, CF patients seem to have an elevated risk profile unique to the disease. The underlying causes have not been fully elucidated but may have to do with mutations in the cystic fibrosis transmembrane conductance regulator (CFTR), which are responsible for the excess thickened mucosal secretions that characterize CF. CFTR also is a tumor-suppressor gene in the intestinal tract of mice, and is important in gastrointestinal epithelial homeostasis. “Absence of CFTR is associated with dysregulation of the immune response, intestinal stem cells, and growth signaling regulators,” the authors noted.
In response to this observed increased risk of colorectal cancers among CF patients, the Cystic Fibrosis Foundation convened an 18-member task force to review the extant literature and compile colorectal cancer screening recommendations for CF patients who show no signs of such malignancies. The team reviewed 1,159 articles and based its findings on the 50 most relevant. The papers comprised observational studies, case-control studies, and case reports; there are no randomized clinical trials of screening for this population.
The American Gastroenterological Association reviewed and approved all of the recommendations:
- Screening decisions should be a collaborative process between the CF patient and clinician, taking into account comorbidities, safety, and quality of life. This should include a discussion of expected lifespan; patients with limited lifespan won’t benefit from screening for a slow-growing cancer. Patients should also consider that the colonoscopy prep for CF patients is somewhat more complex than for non-CF patients. “Given these complexities, the task force agreed that individuals with CF and their providers should … carefully assess the risks and benefits of CRC screening and its impact on the health and quality of life for the adult with CF.”
- The decision team should include an endoscopist. An endoscopist with CF training is preferred, but the panel noted these specialists are rare.
- Colonoscopy is the preferred method of screening for CF patients, since it can both detect and remove polyps. “This is one of the main reasons why colonoscopy is the screening procedure of choice for other high-risk groups,” the panel noted.
- There is insufficient evidence to recommend alternate screening methods in CF patients, including CT scanning, colonography, stool-based tests, or flexible sigmoidoscopy.
- In CF patients without signs of CRC, screening should commence at age 40 years and be repeated every 5 years as long as the results are negative.
- Any CF patient who has had adenomatous polyps on a screening colonoscopy should have a repeat colonoscopy within 3 years, unless clinical findings support more frequent screening.
- For any adult CF patient older than age 30 years who has undergone a solid organ transplant, screening colonoscopy should commence within 2 years of transplantation. “Although the absolute risk of CRC in individuals with CF is extremely low for patients younger than 30 years, the risk … greatly increases after lung transplantation,” to 25-30 times the age-adjusted baseline, the panel wrote. “Increased posttransplantation survival means that many transplant patients will enter older age groups where there is an increased risk of cancer.” Screening should be performed after recovery and within 2 years, unless there was a negative colonoscopy in the 5 years before transplant.
- Thereafter, patients who have had a solid organ transplant should undergo colonoscopy every 5 years, based on their life expectancy. “In cases where the expected survival time is limited (less than 10 years), screening should not be performed. For adults appropriately selected, lung transplantation usually increases survival probability. Therefore, a lung transplantation candidate with a short life expectancy is likely to become a screening candidate before and after transplantation at the appropriate ages described here, because the potential survival increases to approximately 10 years.”
- Colonoscopy should be repeated every 3 years on CF patients with transplants with a history of adenomatous polyps. This interval may be as short as 1 year for patients with high-risk, large, or multiple polyps.
- CF patients should undergo more intense bowel prep for colonoscopy, with three to four washes of at least 1 liter of purgative per wash; the last wash should occur 4-6 hours before the procedure. Split-prep regimens (several smaller-volume washes) are better than a single larger-volume wash. The panel suggested a sample CF-specific regimen available from the Minnesota Cystic Fibrosis Center.
The new document reflects expert consensus on the currently available data, the panel said. As more data emerge, the recommendations might change.
“It is possible that different subpopulations will need more or less frequent schedules for rescreening and surveillance. Our recommendations are making an effort to balance the risk of missing advanced colorectal cancer and minimizing the burden and risk of too frequent examinations.”
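The ages and intervals above can be collapsed into a few lines of logic for quick reference. The Python sketch below is purely illustrative, with hypothetical function and argument names; it encodes only the numeric thresholds quoted from the recommendations and deliberately leaves out the shared decision-making, life-expectancy, and bowel-prep considerations the task force stresses, so it is not clinical guidance.

```python
# Hypothetical summary of the screening ages and intervals described above.
# Not clinical guidance: it omits shared decision-making, life expectancy, bowel prep,
# and the "within 2 years of transplantation" timing rule the task force also describes.

def cf_crc_screening_plan(age, had_solid_organ_transplant, prior_adenomatous_polyps,
                          high_risk_polyps=False):
    """Return (start_age, interval_years) based on the task force's numeric thresholds."""
    start_age = 30 if had_solid_organ_transplant else 40
    if age < start_age:
        return start_age, None  # screening not yet recommended at this age
    if prior_adenomatous_polyps:
        # Repeat within 3 years after adenomatous polyps; the interval may shorten to
        # 1 year for transplant recipients with high-risk, large, or multiple polyps.
        interval = 1 if (had_solid_organ_transplant and high_risk_polyps) else 3
    else:
        interval = 5  # routine rescreening interval while results stay negative
    return start_age, interval

# Example: a 45-year-old transplant recipient with a prior adenoma
print(cf_crc_screening_plan(45, had_solid_organ_transplant=True, prior_adenomatous_polyps=True))
# -> (30, 3)
```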
None of the panel members had any financial disclosures.
SOURCE: Hadjiliadis D et al. Gastroenterology. 2017 Dec 28. doi: 10.1053/j.gastro.2017.12.012.
According to the Cystic Fibrosis Foundation Patient Registry, more than 30,000 people are living with cystic fibrosis (CF) in the United States, and more than half of the CF population is over 18 years of age. It is extremely important to talk to patients about preventive medicine, a topic that CF health care providers have not traditionally included in their management plans.
According to the Cystic Fibrosis Foundation Patient Registry, more than 30,000 people are living with cystic fibrosis (CF ) in the United States. More than half of the CF population is over 18 years of age! It is extremely important to talk to patients about preventative medicine which was not a topic of conversation CF healthcare providers were adding to their management plan in the past.
According to the Cystic Fibrosis Foundation Patient Registry, more than 30,000 people are living with cystic fibrosis (CF ) in the United States. More than half of the CF population is over 18 years of age! It is extremely important to talk to patients about preventative medicine which was not a topic of conversation CF healthcare providers were adding to their management plan in the past.
Adults with cystic fibrosis (CF) should undergo screening colonoscopy for colorectal cancer every 5 years beginning at age 40 years, unless they have had a solid organ transplant – in which case, screening should begin at age 30 years. For both groups, screening intervals should be shortened to 3 years if any adenomatous polyps are recovered.
The new screening recommendation is 1 of 10 set forth by the Cystic Fibrosis Foundation, in conjunction with the American Gastroenterological Association. The document reflects the significantly increased risk of colorectal cancer among adults with the chronic lung disorder, Denis Hadjiliadis, MD, and his colleagues wrote in the February issue of Gastroenterology. ; the risk approaches a 30-fold increase among CF patients who have undergone a lung transplant.
SOURCE: American Gastroenterological Association
In addition to making recommendations on screening intervals and protocols, the document asks clinicians to reframe their thinking of CF as a respiratory-only disease.
“Physicians should recognize that CF is a colon cancer syndrome,” wrote Dr. Hadjiliadis, director of the Adult Cystic Fibrosis Program at the University of Pennsylvania, Philadelphia, and his coauthors.
The increased colorectal cancer risk has become increasingly evident as CF patients live longer, Dr. Hadjiliadis and the panel wrote.
“The current median predicted survival is 41 years, and persons born in 2015 have an estimated average life expectancy of 45 years. The increasing longevity of adults with CF puts them at risk for other diseases, such as gastrointestinal cancer.”
In addition to the normal age-related risk, however, CF patients seem to have an elevated risk profile unique to the disease. The underlying causes have not been fully elucidated but may have to do with mutations in the cystic fibrosis transmembrane conductance regulator (CFTR), which are responsible for the excess thickened mucosal secretions that characterize CF. CFTR also is a tumor-suppressor gene in the intestinal tract of mice, and is important in gastrointestinal epithelial homeostasis. “Absence of CFTR is associated with dysregulation of the immune response, intestinal stem cells, and growth signaling regulators,” the authors noted.
Adults with cystic fibrosis (CF) should undergo screening colonoscopy for colorectal cancer every 5 years beginning at age 40 years, unless they have had a solid organ transplant – in which case, screening should begin at age 30 years. For both groups, screening intervals should be shortened to 3 years if any adenomatous polyps are recovered.
The new screening recommendation is 1 of 10 set forth by the Cystic Fibrosis Foundation, in conjunction with the American Gastroenterological Association. The document reflects the significantly increased risk of colorectal cancer among adults with the chronic lung disorder, Denis Hadjiliadis, MD, and his colleagues wrote in the February issue of Gastroenterology; the risk approaches a 30-fold increase among CF patients who have undergone a lung transplant.
SOURCE: American Gastroenterological Association
In addition to making recommendations on screening intervals and protocols, the document asks clinicians to stop thinking of CF as solely a respiratory disease.
“Physicians should recognize that CF is a colon cancer syndrome,” wrote Dr. Hadjiliadis, director of the Adult Cystic Fibrosis Program at the University of Pennsylvania, Philadelphia, and his coauthors.
The elevated colorectal cancer risk has become increasingly evident as CF patients live longer, Dr. Hadjiliadis and the panel wrote.
“The current median predicted survival is 41 years, and persons born in 2015 have an estimated average life expectancy of 45 years. The increasing longevity of adults with CF puts them at risk for other diseases, such as gastrointestinal cancer.”
In addition to the normal age-related risk, however, CF patients seem to have an elevated risk profile unique to the disease. The underlying causes have not been fully elucidated but may have to do with mutations in the cystic fibrosis transmembrane conductance regulator (CFTR), which are responsible for the excess thickened mucosal secretions that characterize CF. CFTR also is a tumor-suppressor gene in the intestinal tract of mice, and is important in gastrointestinal epithelial homeostasis. “Absence of CFTR is associated with dysregulation of the immune response, intestinal stem cells, and growth signaling regulators,” the authors noted.
In response to this observed increased risk of colorectal cancers among CF patients, the Cystic Fibrosis Foundation convened an 18-member task force to review the extant literature and compile colorectal cancer screening recommendations for CF patients who show no signs of such malignancies. The team reviewed 1,159 articles and based its findings on the 50 most relevant. The papers comprised observational studies, case-control studies, and case reports; there are no randomized clinical trials of screening for this population.
The American Gastroenterological Association reviewed and approved all of the recommendations:
- Screening decisions should be a collaborative process between the CF patient and clinician, taking into account comorbidities, safety, and quality of life. This should include a discussion of expected lifespan; patients with limited lifespan won’t benefit from screening for a slow-growing cancer. Patients should also consider that the colonoscopy prep for CF patients is somewhat more complex than for non-CF patients. “Given these complexities, the task force agreed that individuals with CF and their providers should … carefully assess the risks and benefits of CRC screening and its impact on the health and quality of life for the adult with CF.”
- The decision team should include an endoscopist. An endoscopist with CF training is preferred, but the panel noted these specialists are rare.
- Colonoscopy is the preferred method of screening for CF patients, since it can both detect and remove polyps. “This is one of the main reasons why colonoscopy is the screening procedure of choice for other high-risk groups,” the panel noted.
- There is insufficient evidence to recommend alternate screening methods in CF patients, including CT scanning, colonography, stool-based tests, or flexible sigmoidoscopy.
- In CF patients without signs of CRC, screening should commence at age 40 years and be repeated every 5 years as long as the results are negative.
- Any CF patient who has had adenomatous polyps on a screening colonoscopy should have a repeat colonoscopy within 3 years, unless clinical findings support more frequent screening.
- For any adult CF patient older than age 30 years who has undergone a solid organ transplant, screening colonoscopy should commence within 2 years of transplantation. “Although the absolute risk of CRC in individuals with CF is extremely low for patients younger than 30 years, the risk … greatly increases after lung transplantation,” to 25-30 times the age-adjusted baseline, the panel wrote. “Increased posttransplantation survival means that many transplant patients will enter older age groups where there is an increased risk of cancer.” Screening should be performed after recovery and within 2 years, unless there was a negative colonoscopy in the 5 years before transplant.
- Thereafter, patients who have had a solid organ transplant should undergo colonoscopy every 5 years, based on their life expectancy. “In cases where the expected survival time is limited (less than 10 years), screening should not be performed. For adults appropriately selected, lung transplantation usually increases survival probability. Therefore, a lung transplantation candidate with a short life expectancy is likely to become a screening candidate before and after transplantation at the appropriate ages described here, because the potential survival increases to approximately 10 years.”
- Colonoscopy should be repeated every 3 years on CF patients with transplants with a history of adenomatous polyps. This interval may be as short as 1 year for patients with high-risk, large, or multiple polyps.
- CF patients should undergo more intensive bowel preparation for colonoscopy, with three to four washes of at least 1 liter of purgative per wash; the last wash should occur 4-6 hours before the procedure. Split-prep regimens (several smaller-volume washes) are better than a single larger-volume wash. The panel suggested a sample CF-specific regimen available from the Minnesota Cystic Fibrosis Center.
The new document reflects expert consensus on the currently available data, the panel said. As more data emerge, the recommendations might change.
“It is possible that different subpopulations will need more or less frequent schedules for rescreening and surveillance. Our recommendations are making an effort to balance the risk of missing advanced colorectal cancer and minimizing the burden and risk of too frequent examinations.”
None of the panel members had any financial disclosures.
SOURCE: Hadjiliadis D et al. Gastroenterology. 2017 Dec 28. doi: 10.1053/j.gastro.2017.12.012.
FROM GASTROENTEROLOGY
VIDEO: Gluten-free diet tied to heavy metal bioaccumulation
A gluten-free diet was associated with significantly increased blood levels of mercury, lead, and cadmium and with significantly increased urinary levels of arsenic in a large cross-sectional population-based survey study.
Source: American Gastroenterological Association
After researchers controlled for demographic characteristics, “levels of all heavy metals remained significantly higher in persons following a gluten-free diet, compared with those not following a gluten-free diet,” Stephanie L. Raehsler, MPH, of Mayo Clinic in Rochester, Minn., wrote with her associates in an article published in the February issue of Clinical Gastroenterology and Hepatology.
The purported, but unproven, benefits of a gluten-free diet (GFD) have propelled it into the mainstream well beyond the settings of celiac disease, dermatitis herpetiformis, and wheat allergy. However, GFDs have been linked to nutritional deficits of iron, ferritin, zinc, and fiber, to increased consumption of sugar, fats, and salt, and to excessive bioaccumulation of mercury, the investigators noted.
High intake of rice, a staple of many GFDs, also has been associated with elevated urinary excretion of arsenic (PLoS One. 2014 Sep 8;9[9]:e104768. doi: 10.1371/journal.pone.0104768). To further characterize these relationships, the researchers analyzed data for 2009 through 2012 from 11,354 participants in the National Health and Nutrition Examination Survey (NHANES). Blood levels of lead, mercury, and cadmium were available from 115 participants who reported following a GFD, and data on urinary arsenic levels were available from 32 such individuals.
In the overall study group, blood mercury levels averaged 1.37 mcg/L (95% confidence interval, 1.02-1.85 mcg/L) among persons on a GFD and 0.93 mcg/L (95% CI, 0.86-1.0 mcg/L) in persons not on a GFD (P = .008). Individuals on a GFD also had significantly higher total blood levels of lead (1.42 vs. 1.13 mcg/L; P = .007) and cadmium (0.42 vs. 0.34 mcg/L; P = .03), and they had significantly higher urinary levels of total arsenic (15.2 vs. 8.4 mcg/L; P = .003). These significant differences persisted after researchers controlled for age, sex, race, and smoking status.
Additionally, among 101 individuals on GFDs who had no laboratory or clinical indication of celiac disease, blood levels of total mercury were significantly elevated, compared with individuals not on a GFD (1.40 vs. 0.93 mcg/L; P = .02), as were blood lead concentrations (1.44 vs. 1.13 mcg/L; P = .01) and urinary arsenic levels (14.7 vs. 8.3 mcg/L; P = .01). Blood cadmium levels also were increased (0.42 vs. 0.34 mcg/L), but this difference did not reach statistical significance (P = .06).
Individuals who reported eating fish or shellfish in the past month had higher blood mercury levels than those who did not, regardless of whether they were on a GFD. However, only two individuals in the study exceeded the toxicity threshold for mercury and neither was on a GFD, the researchers said. For most individuals on a GFD, levels of all heavy metals except urinary arsenic stayed under the recognized limits for toxicity, they noted.
The number of respondents following a GFD was small, but the investigators followed NHANES recommendations on sampling weights and sample design variables. Also, although the NHANES included only one question on GFDs, trained interviewers were used to help minimize bias. “Studies are needed to determine the long-term effects of accumulation of these elements in persons on a GFD,” the researchers concluded.
The Centers for Disease Control and Prevention provided partial funding. The researchers reported having no conflicts of interest.
SOURCE: Raehsler S et al. Clin Gastro Hepatol. 2018 (in press).
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: A gluten-free diet was associated with significantly increased bioaccumulation of several heavy metals.
Major finding: After accounting for demographic factors, blood or urinary levels of lead, cadmium, arsenic, and mercury were significantly higher in persons following a gluten-free diet, compared with those who did not follow a gluten-free diet.
Data source: A population-based, cross-sectional study of 11,354 respondents to NHANES 2009-2012, including 115 persons on a gluten-free diet.
Disclosures: The Centers for Disease Control and Prevention provided partial funding. The researchers reported having no conflicts of interest.
Source: Raehsler S et al. Clin Gastro Hepatol. 2018 (in press).
VIP an unwelcome contributor to eosinophilic esophagitis
Vasoactive intestinal peptide (VIP) appears to play an important role in the pathology of eosinophilic esophagitis (EoE) by recruiting mast cells and eosinophils that contribute to EoE’s hallmark symptoms of dysphagia and esophageal dysmotility, investigators reported in the February issue of Cellular and Molecular Gastroenterology and Hepatology.
Blocking one of three VIP receptors – chemoattractant receptor-homologous molecule expressed on Th2 (CRTH2) – could reduce eosinophil infiltration and mast cell numbers in the esophagus, wrote Alok K. Verma, PhD, a postdoctoral fellow at Tulane University in New Orleans, and his colleagues.
“We suggest that inhibiting the VIP–CRTH2 axis may ameliorate the dysphagia, stricture, and motility dysfunction of chronic EoE,” they wrote in a research letter to Cellular and Molecular Gastroenterology and Hepatology.
Several cytokines and chemokines, notably interleukin-5 and eotaxin-3, have been fingered as suspects in eosinophil infiltration, but whether chemokines other than eotaxin play a role has not been well documented, the investigators noted.
They hypothesized that VIP may be a chemoattractant that draws eosinophils into perineural areas of the muscular mucosa of the esophagus.
To test this idea, they examined VIP expression in samples from patients with and without EoE. VIP expression was low among controls without EoE, whereas eosinophils accumulated near VIP-expressing nerve cells in biopsy samples from patients with EoE.
When they performed in vitro studies of VIP binding and immunologic functions, they found that eosinophils primarily express the CRTH2 receptor rather than the vasoactive intestinal peptide receptor 1 (VPAC-1) or VPAC-2.
They also demonstrated that VIP’s effect on eosinophil motility was similar to that of eotaxin, and that pretreating eosinophils with a CRTH2 inhibitor hampered eosinophil motility.
The investigators next looked at biopsy specimens from patients with EoE and found that eosinophils that express CRTH2 accumulated in the epithelial mucosa.
To see whether (as they and other researchers had suspected) VIP and its interaction with the CRTH2 receptor might play a role in mast cell recruitment, they performed immunofluorescence analyses and confirmed the presence of the CRTH2 receptor on tryptase-positive mast cells in the esophageal mucosa of patients with EoE.
“These findings suggest that, similar to eosinophils, mast cells accumulate via interaction of the CRTH2 receptor with neurally derived VIP,” they wrote.
Finally, to see whether a CRTH2 antagonist – which reduced peak eosinophil levels in patients with EoE in prior studies – could also ameliorate the negative effects of mast cells on esophageal function, they looked at the effects of CRTH2 inhibition in a mouse model of human EoE.
They found that, in the mice treated with a CRTH2 blocker, each segment of the esophagus had significant reductions in both eosinophil infiltration and mast cell numbers (P less than .05 for each).
The work was supported in part by grants from the National Institutes of Health and the Tulane Edward G. Schlieder Educational Foundation. Senior author Anil Mishra, PhD, disclosed serving as a consultant for Axcan Pharma, Aptalis, Elite Biosciences, Calypso Biotech SA, and Enumeral Biomedical. The remaining authors disclosed no conflicts of interest.
SOURCE: Verma AK et al. Cell Mol Gastroenterol Hepatol. 2018;5[1]:99-100.e7.
The rapid increase in the incidence of pediatric and adult eosinophilic esophagitis (EoE) draws immediate attention to the importance of studying the mechanisms underlying this detrimental condition. The lack of preventive or curative therapies for EoE further underscores the importance of research that addresses gaps in our understanding of how eosinophilic inflammation of the esophagus is regulated on the molecular and cellular level. EoE is classified as an allergic immune disorder of the gastrointestinal tract and is characterized by eosinophil-rich, chronic Th2-type inflammation of the esophagus.
In this recent publication, the laboratory of Anil Mishra, PhD, showed that vasoactive intestinal peptide (VIP) serves as a potent chemoattractant for eosinophils and promotes accumulation of these innate immune cells adjacent to nerve cells in the muscular mucosa. Increased VIP expression was documented in EoE patients when compared to controls, and the authors identified the chemoattractant receptor homologous molecule expressed on Th2 lymphocytes (CRTH2) as a main binding receptor for VIP. Interestingly, CRTH2 was not only found to be expressed on eosinophils but also on tissue mast cells – another innate immune cell type that significantly contributes to the inflammatory tissue infiltrate in EoE patients. Based on the human findings, the authors tested whether VIP plays a major role in recruiting eosinophils and mast cells to the inflamed esophagus and whether CRTH2 blockade can modulate experimental EoE. Indeed, EoE pathology improved in animals that were treated with a CRTH2 antagonist.
In conclusion, these observations suggest that inhibiting the VIP-CRTH2 axis may serve as a therapeutic intervention pathway to ameliorate innate tissue inflammation in EoE patients.
Edda Fiebiger, PhD, is in the department of pediatrics in the division of gastroenterology, hepatology and nutrition at Boston Children’s Hospital, as well as in the department of medicine at Harvard Medical School, also in Boston. She had no disclosures.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: VIP appears to play an important role in the pathogenesis of eosinophilic esophagitis (EoE).
Major finding: Neurally derived VIP and its interaction with the CRTH2 receptor appear to recruit eosinophils and mast cells into the esophageal mucosa.
Data source: In vitro studies of human EoE biopsy samples and in vivo studies in mouse models of EoE.
Disclosures: The work was supported in part by grants from the National Institutes of Health and the Tulane Edward G. Schlieder Educational Foundation. Senior author Anil Mishra, PhD, disclosed serving as a consultant for Axcan Pharma, Aptalis, Elite Biosciences, Calypso Biotech SA, and Enumeral Biomedical. The remaining authors disclosed no conflicts of interest.
Source: Verma AK et al. Cell Mol Gastroenterol Hepatol. 2018;5[1]:99-100.e7.
Study eyed natural history of branch-duct intraductal papillary mucinous neoplasms
Branch-duct intraductal papillary mucinous neoplasms (BD-IPMNs) grew at a median annual rate of 0.8 mm in a retrospective study of 1,369 patients.
While most of these cysts were “indolent and dormant,” some grew rapidly and developed “other worrisome features,” Youngmin Han, MS, of Seoul (South Korea) National University reported with his associates in the February issue of Gastroenterology. Therefore, clinicians should plan follow-up surveillance based on initial cyst size and growth rate, they concluded.
Based on their findings, the researchers recommended surgery for young, fit, asymptomatic patients who have BD-IPMNs with a diameter of at least 30 mm or with thickened cyst walls, or those who have a main pancreatic duct measuring 5-9 mm. Surgery also should be considered when patients have lymphadenopathy, high tumor marker levels, or an abrupt change in pancreatic duct caliber with distal pancreatic atrophy or a rapidly growing cyst, they said.
For asymptomatic patients whose cysts are under 10 mm and who do not have worrisome features, they recommended follow-up with CT or MRI at 6 months and then every 2 years after that. Cysts of 10-20 mm should be imaged at 6 months, at 12 months, and then every 1.5-2 years after that, they said. Patients with cyst diameters greater than 20 mm “should undergo MRI or CT or EUS [endoscopic ultrasound] every 6 months for 1 year and then annually thereafter, until the cyst size and features become stable,” they added. Patients whose cysts have a diameter of 30 mm or greater “should be closely monitored with MRI or CT or EUS every 6 months. Surgical resection can be considered in younger patients or those with other combined worrisome features.”
To characterize the natural history of BD-IPMN, the investigators evaluated clinical and imaging data collected between 2001 and 2016 from patients with classical features of BD-IPMN. Each patient included in the study provided 3 or more years of CT, MRI, EUS, and endoscopic retrograde cholangiopancreatography data. The researchers used regression models to estimate changes in sizes of cysts and main pancreatic ducts.
Median follow-up time was 61 months (range, 36-189 months). Cyst diameter averaged 12.8 mm (standard deviation, 6.5 mm) at baseline and 17 mm (SD, 9.2 mm) at final measurement. Larger baseline diameter was associated with faster growth (P = .046): Cysts measuring less than 10 mm at baseline grew at a median annual rate of 0.8 mm (SD, 1.1 mm), while those measuring at least 30 mm grew at a median annual rate of 1.2 mm (SD, 2.1 mm).
Worrisome features were present in 59 patients at baseline and emerged in another 150 patients during follow-up. At baseline, only 2.3% of cysts exceeded 30 mm in diameter, but 8.0% did at final measurement. Cyst wall thickening was found in 0.5% of patients at baseline and 3.7% of patients at final measurement. Main pancreatic ducts measured 5-9 mm in 1.9% of patients at baseline and in 5.6% of patients at final measurement. Additionally, the prevalence of mural nodules rose from 0.4% at baseline to 3.1% at final measurement.
Main pancreatic ducts averaged 1.8 mm (SD, 1.0 mm) at baseline and 2.4 mm (SD, 1.8 mm) at final measurement. Compared with the values seen with smaller cysts, larger baseline cyst diameter correlated significantly with larger main pancreatic ducts, more cases of cyst wall thickening, and more cases with mural nodules (P less than .001 for all comparisons).
The study was funded by a grant from Korean Health Technology R&D Project of Ministry of Health and Welfare, Republic of Korea. The investigators reported having no conflicts of interest.
SOURCE: Han Y et al. Gastroenterology. 2018. doi: 10.1053/j.gastro.2017.10.013.
The appropriate management of branch-duct intraductal papillary mucinous neoplasms (BD-IPMNs), precursor cystic lesions to pancreatic cancer, has been a controversial issue since their initial description in 1982. Current national and international guidelines are primarily based on surgical series with potential selection bias and on observational studies with short surveillance periods. Consequently, there is limited information on the natural history and, more importantly, the malignant potential of BD-IPMNs.
The study by Youngmin Han and colleagues represents a comprehensive analysis of over 1,000 patients, each with at least 3 years of follow-up for a suspected BD-IPMN. In addition, the authors identified an optimal screening method for patients based on cyst size. Their data largely validates prior reports and will undoubtedly serve as the basis for future pancreatic cyst guidelines.
However, as the authors note, limitations of their study include its retrospective design and the need to validate their screening protocol. Moreover, several lingering questions remain for patients with BD-IPMNs: What is the best method of measuring a BD-IPMN (for example, CT, MRI, or endoscopic ultrasound)? How long should surveillance continue? And what is the role for cytopathology and ancillary studies, such as carcinoembryonic antigen testing, molecular testing, and testing for other pancreatic cyst biomarkers? At the risk of invoking a cliché, “further studies are needed” to identify an optimal treatment algorithm and, considering the increasingly frequent detection of pancreatic cysts, a cost-effective approach to the evaluation of patients with BD-IPMNs.
Aatur D. Singhi, MD, PhD, is in the division of anatomic pathology in the department of pathology at the University of Pittsburgh Medical Center. He has no conflicts of interest.
FROM GASTROENTEROLOGY
Key clinical point: Tailor the surveillance of BD-IPMNs based on initial diameter and the presence or absence of high-risk features.
Major finding: Median annual growth rate was 0.8 mm.
Data source: A retrospective study of 1,369 patients with BD-IPMNs.
Disclosures: The study was funded by a grant from the Korean Health Technology R&D Project of the Ministry of Health and Welfare, Republic of Korea. The investigators reported having no conflicts of interest.
Source: Han Y et al. Gastroenterology. 2018. doi: 10.1053/j.gastro.2017.10.013.
One in five Crohn’s disease patients have major complications after infliximab withdrawal
About one in five patients with Crohn’s disease who stopped infliximab while in stable remission developed a major disease-related complication over roughly 7 years of follow-up, according to research published in the February issue of Clinical Gastroenterology and Hepatology (doi: 10.1016/j.cgh.2017.09.061).
About 70% of patients remained free of both infliximab restart failure and major complications, said Catherine Reenaers, MD, PhD, of Centre Hospitalier Universitaire de Liège (Belgium), and her associates. Significant predictors of major complications included upper gastrointestinal disease at the time of infliximab withdrawal, white blood cell count of at least 5.0 × 10⁹ per L, and hemoglobin level under 12.5 g per dL. “Patients with at least two of these factors had a more than 40% risk of major complication in the 7 years following infliximab withdrawal,” the researchers reported.
Little is known about long-term outcomes after patients with Crohn’s disease withdraw from infliximab. Therefore, Dr. Reenaers and her associates retrospectively studied 102 patients with Crohn’s disease who had received infliximab and an antimetabolite (azathioprine, mercaptopurine, or methotrexate) for at least 12 months, had been in steroid-free clinical remission for at least 6 months, and then withdrew from infliximab. Patients were recruited from 19 centers in Belgium and France and were originally part of a prospective cohort study of infliximab withdrawal in Crohn’s disease (Gastroenterology. 2012;142[1]:63-70.e5).
About half of patients relapsed and restarted infliximab within 12 months, which is in line with other studies, the researchers noted. Over a median follow-up of 83 months (interquartile range, 71-93 months), 21% (95% confidence interval, 13.1%-30.3%) of patients had no complications, did not restart infliximab, and started no other biologics. In all, 70.2% of patients (95% CI, 60.2%-80.1%) had no major complications and did not fail to respond after restarting infliximab.
Eighteen patients (19%; 95% CI, 10%-27%) developed major complications: 14 who required surgery and 4 who developed new complex perianal lesions. In a multivariable model, the strongest independent predictor of major complications was leukocytosis (hazard ratio, 10.5; 95% CI, 1.3-83; P less than .002), followed by upper gastrointestinal disease (HR, 5.8; 95% CI, 1.5-22) and low hemoglobin level (HR, 4.1; 95% CI, 1.5-21.8; P less than .01). The 13 patients who lacked these risk factors had no major complications of infliximab withdrawal. Among 72 patients who had at least one risk factor, 16.3% (95% CI, 7%-25%) developed major complications over 7 years. Strikingly, among 17 with at least two risk factors, 43% (95% CI, 17%-69%) developed major complications over 7 years, the researchers noted.
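To make the risk stratification concrete, the following minimal Python sketch tallies the three reported factors for a hypothetical patient. The thresholds mirror those reported above, but the function, the variable names, and the example values are illustrative assumptions, not a validated tool from the study.

# Illustrative only: counts the three risk factors reported by Reenaers et al.
# (leukocytosis, upper GI disease, low hemoglobin) for a hypothetical patient.
# Thresholds follow the article; this is not a validated prediction tool.

def count_risk_factors(wbc_10e9_per_l, hemoglobin_g_per_dl, upper_gi_disease):
    """Return how many of the three reported risk factors are present."""
    factors = 0
    if wbc_10e9_per_l >= 5.0:          # white blood cell count >= 5.0 x 10^9/L
        factors += 1
    if hemoglobin_g_per_dl < 12.5:     # hemoglobin < 12.5 g/dL
        factors += 1
    if upper_gi_disease:               # upper GI involvement at withdrawal
        factors += 1
    return factors

# Hypothetical example patient
n = count_risk_factors(wbc_10e9_per_l=6.2, hemoglobin_g_per_dl=11.8, upper_gi_disease=False)
print(n, "risk factor(s) present")     # prints: 2 risk factor(s) present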
Complications emerged a median of 50 months (interquartile range, 41-73 months) after patients received their last infliximab infusion, highlighting the need for close long-term monitoring even if patients show no signs of early clinical relapse after infliximab withdrawal, the investigators said. “One strength of this cohort was the homogeneity of the population,” they stressed. “Most studies of anti–tumor necrosis factor withdrawal after clinical remission were limited by heterogeneous populations, variable lengths of infliximab treatment before discontinuation, and variable use of immunomodulators and corticosteroids. In [our] cohort, the population was homogenous, infliximab withdrawal was standardized, and the disease characteristics at the time of stopping were collected prospectively.” Although follow-up times varied, less than 5% of patients were followed for less than 3 years, they noted.
The researchers did not acknowledge external funding sources. Dr. Reenaers disclosed ties to AbbVie, Takeda, MSD, Mundipharma, Hospira, and Ferring.
SOURCE: Reenaers C et al. Clin Gastroenterol Hepatol. 2018 February (in press).
The option of stopping a biologic agent is an attractive prospect for most Crohn's disease (CD) patients in stable clinical remission. The STORI trial, published in 2012, was among the earliest of the select few studies to address withdrawal of biologic therapy in CD patients who had been in sustained clinical remission on combination therapy (infliximab plus a thiopurine or methotrexate) for at least 6 months. In that trial, almost 50% of patients relapsed within a year of stopping infliximab.
Reenaers et al. recently published long-term follow-up of the original STORI cohort. After a median follow-up of 7 years, four out of five patients previously in clinical remission on combination therapy experienced worsening disease activity following withdrawal of infliximab. While the majority (70%) were able to resume infliximab and recapture disease response without untoward adverse effects, one in five patients experienced major disease-related complications such as complex perianal disease or the need for abdominal surgery. Upper GI tract involvement, a high white blood cell count, and a low hemoglobin concentration were associated with an increased likelihood of a major complication. Notably, the median time to a major complication was almost 4 years.
These results are similar to long-term relapse rates reported in other studies of therapy withdrawal in CD. While biomarkers such as C-reactive protein and fecal calprotectin, along with endoscopic disease activity, are reliable predictors of short-term relapse, clinical factors such as family history of CD, disease extent, stricturing or penetrating disease, and cigarette smoking are more relevant predictors of long-term disease activity. Both types of predictors should be weighed when considering withdrawal of therapy in CD.
Lastly, while the majority of patients who relapse following withdrawal of a biologic agent will do so within a year or two, a subset may not experience disease-related complications for several years, underscoring the need for long-term follow-up.
Manreet Kaur, MD, is assistant professor in the division of gastroenterology and hepatology; medical director, Inflammatory Bowel Disease Center, and medical director, faculty group practice, Baylor College of Medicine, Houston.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: Over 7 years, about one in five patients with remitted Crohn’s disease developed a major complication after withdrawing from infliximab, despite remaining on an antimetabolite.
Major finding: Eighteen patients (19%; 95% CI, 10%-27%) developed major complications: Fourteen needed surgery and four developed new complex perianal lesions.
Data source: A cohort study of 102 patients with Crohn’s disease who had received infliximab and an antimetabolite for at least 12 months, had been in steroid-free clinical remission for at least 6 months, and who then withdrew from infliximab.
Disclosures: The researchers did not acknowledge external funding sources. Dr. Reenaers disclosed ties to AbbVie, Takeda, MSD, Mundipharma, Hospira, and Ferring.
Source: Reenaers C et al. Clin Gastroenterol Hepatol. 2018 February (in press).
Eradicating HCV significantly improved liver stiffness in meta-analysis
Eradicating chronic hepatitis C virus (HCV) infection led to significant decreases in liver stiffness in a systematic review and meta-analysis of nearly 3,000 patients.
Mean liver stiffness fell by 4.1 kilopascals (kPa; 95% confidence interval, 3.3-4.9 kPa) 12 or more months after patients achieved sustained virologic response (SVR) to treatment, but did not significantly change in patients who did not achieve SVR, reported Siddharth Singh, MD, of the University of California, San Diego, and his associates in the January issue of Clinical Gastroenterology and Hepatology (doi: 10.1016/j.cgh.2017.04.038). The results were especially striking among patients who received direct-acting antiviral agents (DAAs) or who had high baseline levels of inflammation, the investigators added.
Based on these findings, about 47% of patients with advanced fibrosis or cirrhosis at baseline will drop below 9.5 kPa after achieving SVR, they reported. “With this decline in liver stiffness, it is conceivable that risk of liver-related complications would decrease, particularly in patients without cirrhosis,” they added. “Future research is warranted on the impact of magnitude and kinetics of decline in liver stiffness on improvement in liver-related outcomes.”
Eradicating HCV infection was known to decrease liver stiffness, but the magnitude of decline was not well understood. Therefore, the reviewers searched the literature through October 2016 for studies of HCV-infected adults who underwent liver stiffness measurement by vibration-controlled transient elastography before and at least once after completing HCV treatment. All studies also included data on median liver stiffness among patients who did and did not achieve SVR. The search identified 23 observational studies and one post hoc analysis of a randomized controlled trial, for a total of 2,934 patients, of whom 2,214 achieved SVR.
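As a rough illustration of how study-level changes in liver stiffness can be combined, the short Python sketch below pools hypothetical mean changes by inverse-variance weighting. This simplified fixed-effect calculation and its made-up numbers are for illustration only; they are not the method or data of the published meta-analysis, which may well have used a different (for example, random-effects) model.

# Simplified inverse-variance (fixed-effect) pooling of study-level mean
# changes in liver stiffness (kPa). Data are hypothetical and are not the
# studies included in the published meta-analysis.
import math

studies = [
    # (mean change in kPa, standard error)
    (-4.0, 0.6),
    (-3.2, 0.9),
    (-5.1, 1.2),
]

weights = [1 / se**2 for _, se in studies]
pooled = sum(w * m for (m, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

print(f"Pooled mean change: {pooled:.2f} kPa (95% CI {ci_low:.2f} to {ci_high:.2f})")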
Among patients who achieved SVR, mean liver stiffness dropped by 2.4 kPa at the end of treatment (95% CI, 1.7-3.0 kPa), by 3.1 kPa 1-6 months later (95% CI, 1.6-4.7 kPa), and by 3.2 kPa 6-12 months after completing treatment (90% CI, 2.6-3.9 kPa). A year or more after finishing treatment, patients who achieved SVR had a 28% median decrease in liver stiffness (interquartile range, 22%-35%). However, liver stiffness did not significantly change among patients who did not achieve SVR, the reviewers reported.
Mean liver stiffness declined significantly more among patients who received DAAs (4.5 kPa) than among recipients of interferon-based regimens (2.6 kPa; P = .03). However, studies of DAAs included patients with greater liver stiffness at baseline, which could at least partially explain this discrepancy, the investigators said. Baseline cirrhosis also was associated with a greater decline in liver stiffness (mean, 5.1 kPa, vs. 2.8 kPa in patients without cirrhosis; P = .02), as was high baseline alanine aminotransferase level (P less than .01). Among patients whose baseline liver stiffness measurement exceeded 9.5 kPa, 47% had their liver stiffness drop to less than 9.5 kPa after achieving SVR.
Coinfection with HIV did not significantly alter the magnitude of decline in liver stiffness 6-12 months after treatment in patients who achieved SVR, the reviewers noted. “[Follow-up] assessment after SVR was relatively short; hence, long-term evolution of liver stiffness after antiviral therapy and impact of decline in liver stiffness on patient clinical outcomes could not be ascertained,” they wrote. The studies also did not consistently assess potential confounders such as nonalcoholic fatty liver disease, diabetes, and alcohol consumption.
One reviewer disclosed funding from the National Institutes of Health/National Library of Medicine. None had conflicts of interest.
The current era of new-generation direct-acting antiviral agents has revolutionized the treatment landscape of chronic hepatitis C virus infection, providing short-duration, safe, and consistently effective regimens that achieve SVR, or cure, in nearly 100% of patients. While achieving SVR is important, even more important is the long-term impact of SVR and whether cure translates into outcomes such as improved mortality or a reduced risk of disease progression. Although improved mortality after SVR has been demonstrated, one of the main drivers of the risk of disease progression is the severity of hepatic fibrosis.
Robert J. Wong, MD, MS, is with the department of medicine and is director of research and education, division of gastroenterology and hepatology, Alameda Health System – Highland Hospital, Oakland, Calif. He has received a 2017-2019 Clinical Translational Research Award from AASLD, has received research funding from Gilead and AbbVie, and is on the speakers bureau of Gilead, Salix, and Bayer. He has also done consulting for and been an advisory board member for Gilead.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: Eradicating chronic hepatitis C virus infection led to significant decreases in liver stiffness.
Major finding: Mean liver stiffness decreased by 4.1 kPa 12 or more months after patients achieved sustained virologic response to treatment, but did not significantly improve in patients who lacked SVR.
Data source: A systematic review and meta-analysis of 2,934 patients from 23 observational studies and one post hoc analysis of a randomized controlled trial.
Disclosures: One reviewer disclosed funding from the National Institutes of Health/National Library of Medicine. The reviewers reported having no conflicts of interest.
VIDEO: Project ECHO would cost-effectively expand HCV treatment
Training community health providers to treat chronic hepatitis C virus infection is a cost-effective way to expand treatment access and reduce the national burden of this prevalent condition, according to research published in the December issue of Gastroenterology (doi: 10.1053/j.gastro.2017.10.016).
The model, dubbed Project ECHO, “is the best way, to our knowledge, to cost-effectively find and treat HCV patients at scale,” wrote Thilo Rattay, MPH, of the University of Michigan School of Public Health, Ann Arbor, and his associates. “Our analysis demonstrates that fundamentally changing the care delivery model for HCV enables unparalleled reach, in contrast to simply using ever more cost-effective drugs in an inefficient system. Project ECHO can quickly reduce the burden of disease from HCV and accelerate the impact of the new generation of highly effective medications.”
Project ECHO (echo.unm.edu) links multidisciplinary teams of specialists (hubs) to physicians and nurse practitioners in community practice (spokes). Each hub, which is usually based at an academic medical center, holds video conferences to mentor and teach providers about best practices for managing conditions ranging from autism to Zika virus infection. Initial reports suggest that Project ECHO can improve health care quality and access as well as job satisfaction among primary care providers, the researchers noted.
Project ECHO has 127 hubs globally, including 77 in the United States, and receives support from foundations, state legislatures, and government agencies. Because patients with chronic HCV vastly outnumber gastroenterologists in the United States, Mr. Rattay and his coinvestigators used Markov models to evaluate Project ECHO’s cost-effectiveness in the HCV setting. To do so, they created a decision tree and Markov models with Microsoft Excel, PrecisionTree, and @RISK by using data from the U.S. Census Bureau, MarketScan, and an extensive literature review.
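To illustrate the mechanics behind such estimates, here is a toy two-strategy Markov cohort sketch in Python that accumulates discounted costs and quality-adjusted life-years and computes an incremental cost-effectiveness ratio (ICER). Every transition probability, cost, and utility below is invented for illustration; this is not the authors’ Excel/PrecisionTree/@RISK model.

# Toy Markov cohort model comparing two HCV care strategies and computing an
# ICER. States: infected, cured, dead. All parameters are hypothetical and
# chosen only to show the mechanics, not to reproduce the published model.

def run_cohort(p_treated_per_year, years=20, discount=0.03):
    infected, cured = 1.0, 0.0
    p_die_infected, p_die_cured = 0.02, 0.01      # annual mortality
    u_infected, u_cured = 0.75, 0.90              # QALY weights
    treatment_cost = 30_000                       # one-time cost per treated patient
    care_cost_infected = 1_000                    # annual care cost while infected
    total_cost = total_qalys = 0.0
    for t in range(years):
        d = 1 / (1 + discount) ** t               # discount factor
        newly_cured = infected * p_treated_per_year
        total_cost += d * (care_cost_infected * infected
                           + treatment_cost * newly_cured)
        total_qalys += d * (u_infected * infected + u_cured * cured)
        infected = infected - newly_cured - infected * p_die_infected
        cured = cured + newly_cured - cured * p_die_cured
    return total_cost, total_qalys

cost_sq, qaly_sq = run_cohort(p_treated_per_year=0.05)       # status quo
cost_echo, qaly_echo = run_cohort(p_treated_per_year=0.20)   # expanded treatment capacity
icer = (cost_echo - cost_sq) / (qaly_echo - qaly_sq)
print(f"ICER: ${icer:,.0f} per QALY gained")

The point of the sketch is only that an ICER divides the incremental discounted cost by the incremental discounted QALYs; the published model was far more detailed.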
The models yielded an incremental cost-effectiveness ratio of $10,351 per quality-adjusted life year when compared with the status quo, said the researchers. Commonly cited willingness-to-pay thresholds are $50,000 and $100,000, indicating that Project ECHO is a cost-effective way to expand HCV treatment, they added. However, insurers would pay substantially more during the first 5 years of rollout – about $708 million versus $368 million with the status quo. During the first year, ECHO would cost payers about $350.5 million more than would the status quo, but 4,446 more patients would be treated, drastically reducing prevalence in the insurance pool. Consequently, subsequent costs would drop by nearly $11 million over the first 5 years of ECHO. Nonetheless, the “high budgetary costs suggest that incremental rollout of [Project] ECHO may be best,” the investigators wrote.
They were unable to determine whether increased treatment under ECHO relates to expanded screening, treatment adherence, or access, but sensitivity analyses suggested that “results are largely independent of the cause,” the researchers wrote. “Importantly, most of the financial benefits of treating HCV are not immediate, while a majority of the costs are upfront,” they stressed. Stakeholders therefore need to adopt a long-term view and consider population-based health care models and reimbursement strategies that “capture the full benefit of this type of ecosystem.”
The investigators had no external funding sources and no conflicts of interest.
FROM GASTROENTEROLOGY
Key clinical point: A teletraining model called Project ECHO is a cost-effective way to expand access to treatment for chronic hepatitis C virus infection.
Major finding: The incremental cost-effectiveness ratio was $10,351 per quality-adjusted life year, compared with the status quo. Commonly cited willingness-to-pay thresholds are $50,000 and $100,000.
Data source: A decision tree and Markov models created with Microsoft Excel, PrecisionTree, and @RISK using data from the U.S. Census Bureau, MarketScan, and an extensive literature review.
Disclosures: The investigators had no external funding sources and no conflicts of interest.