MELD sodium score tied to better transplant outcomes
Factoring hyponatremic status into liver graft allocations led to significant reductions in wait-list mortality, researchers reported in the November issue of Gastroenterology.
Hyponatremic patients with low MELD scores benefited significantly from allocation based on the Model for End-Stage Liver Disease–sodium (MELD-Na) score, whereas the survival benefit was less evident among patients with higher scores, said Shunji Nagai, MD, PhD, of Henry Ford Hospital, Detroit, and his associates. “Therefore, liver allocation rules such as Share 15 and Share 35 need to be revised to fulfill the Final Rule under the MELD-Na based allocation,” they wrote.
The Share 35 rule offers liver grafts locally and regionally to wait-listed patients with MELD-Na scores of at least 35. Under the Share 15 rule, livers are offered regionally or nationally before considering local candidates with MELD scores under 15. The traditional MELD scoring system excluded hyponatremia, which has since been found to independently predict death from cirrhosis. Therefore, in January 2016, a modified MELD-Na score was implemented for patients with traditional MELD scores of at least 12. The MELD-Na score assigns patients between 1 and 11 additional points, with the most points going to patients who have low MELD scores and severe hyponatremia. To assess the impact of this change, Dr. Nagai and his associates compared wait-list and posttransplantation outcomes between the pre– and post–MELD-Na eras and assessed the survival benefit of liver transplantation during the MELD-Na period. The study included all adults wait-listed for livers from June 2013, when Share 35 was implemented, through September 2017.
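For readers who want to see how those additional points arise, the commonly cited OPTN adjustment can be sketched in a few lines of Python. The formula below (MELD-Na = MELD + 1.32 × [137 − Na] − 0.033 × MELD × [137 − Na], with serum sodium bounded to 125-137 mmol/L, the adjustment applied only at MELD scores of 12 or higher, and the total capped at 40) is not spelled out in the article and is included here only as an illustrative assumption; the allocation system’s exact implementation may differ.

```python
def meld_na(meld: int, sodium_mmol_l: float) -> int:
    """Approximate MELD-Na score (illustrative sketch of the commonly cited formula)."""
    if meld < 12:
        return meld  # below a score of 12, the traditional MELD score is used unchanged
    na = min(max(sodium_mmol_l, 125.0), 137.0)  # bound sodium to 125-137 mmol/L
    score = meld + 1.32 * (137.0 - na) - 0.033 * meld * (137.0 - na)
    return min(round(score), 40)  # allocation scores are capped at 40


# A low-MELD patient with severe hyponatremia gains far more points than a
# high-MELD patient with the same sodium value, consistent with the description above.
print(meld_na(meld=15, sodium_mmol_l=124))  # roughly 25 (about 10 extra points)
print(meld_na(meld=35, sodium_mmol_l=124))  # roughly 37 (about 2 extra points)
```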
Mortality within 90 days on the wait list fell significantly during the MELD-Na era (hazard ratio, 0.74; P less than .001). Transplantation conferred a “definitive” survival benefit when MELD-Na scores were 21-23 (HR versus wait list, 0.34; P less than .001). During the traditional MELD period, the equivalent cutoff was 15-17 (HR, 0.36; P less than .001). “As such, the current rules for liver allocation may be suboptimal under the MELD-Na–based allocation and the criteria for Share 15 may need to be revisited,” the researchers wrote. They recommended raising the cutoff to 21.
The study also confirmed mild hyponatremia (130-134 mmol/L), moderate hyponatremia (125-129 mmol/L), and severe hyponatremia (less than 125 mmol/L) as independent predictors of wait-list mortality during the traditional MELD era. Hazard ratios were 1.4, 1.8, and 1.7, respectively (all P less than .001). The implementation of MELD-Na significantly weakened these associations, with HRs of 1.1 (P = .3), 1.3 (P = .02), and 1.4 (P = .04), respectively.
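As a quick reference, the severity bands and the traditional-MELD-era hazard ratios reported above can be captured in a short sketch; the cutoffs and point estimates come directly from the study, while the helper function itself is purely illustrative and does not reproduce the underlying regression model.

```python
# Traditional-MELD-era hazard ratios for wait-list mortality, as reported in the study.
WAITLIST_MORTALITY_HR_PRE_MELD_NA = {
    "mild": 1.4,      # 130-134 mmol/L
    "moderate": 1.8,  # 125-129 mmol/L
    "severe": 1.7,    # less than 125 mmol/L
}


def hyponatremia_severity(sodium_mmol_l: float) -> str:
    """Classify hyponatremia using the cutoffs cited in the study."""
    if sodium_mmol_l < 125:
        return "severe"
    if sodium_mmol_l < 130:
        return "moderate"
    if sodium_mmol_l < 135:
        return "mild"
    return "none"


severity = hyponatremia_severity(127)
print(severity, WAITLIST_MORTALITY_HR_PRE_MELD_NA.get(severity))  # moderate 1.8
```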
The probability of transplantation also rose significantly during the MELD-Na era (HR, 1.2; P less than .001), possibly because of the opioid epidemic, the researchers said. Although greater availability of liver grafts might have improved wait-list outcomes, all score categories would have shown a positive impact if this were the only reason, they added. Instead, MELD-Na most benefited patients with lower scores.
Finally, posttransplantation outcomes worsened during the MELD-Na era, perhaps because of aging of the transplant population. However, the survival benefit of transplant shifted to higher score ranges during the MELD-Na era even after the researchers controlled for this effect. “According to this analysis,” they wrote, “the survival benefit of liver transplant was definitive in patients with score category of 21-23, which could further validate our proposal to revise Share 15 rule to ‘Share 21.’ ”
The investigators reported having no external funding sources or conflicts of interest.
SOURCE: Nagai S et al. Gastroenterology. 2018 Jul 26. doi: 10.1053/j.gastro.2018.07.025.
FROM GASTROENTEROLOGY
Key clinical point: The implementation of the MELD sodium (MELD-Na) score for liver allocation was associated with significantly improved outcomes for wait-listed patients.
Major finding: During the MELD-Na era, mortality within 90 days on the liver wait list dropped significantly (HR, 0.74; P less than .001) while the probability of transplant rose significantly (HR, 1.2; P less than .001).
Study details: Comparison of 18,850 adult transplant candidates during the traditional MELD era versus 14,512 candidates during the MELD-Na era.
Disclosures: The investigators had no external funding sources or conflicts of interest.
Source: Nagai S et al. Gastroenterology. 2018 Jul 26. doi: 10.1053/j.gastro.2018.07.025.
Proximal adenoma location does not predict high-grade dysplasia
Proximal adenoma location did not predict high-grade dysplasia in a large registry study.
In fact, the odds of high-grade dysplasia were about 25% lower for proximal versus distal adenomas (odds ratio, 0.75), reported Thomas Rösch, MD, of University Hospital Hamburg-Eppendorf, Hamburg, Germany, and his associates. A third of adenomas in the study lacked location data, but in sensitivity analyses, the odds of high-grade dysplasia fell to 0.72 when these lesions were assumed to be proximal and rose to 0.96 when they were assumed to be distal.
Interval colorectal cancers probably are more likely to be proximal than distal because of a “combination of endoscopy-related factors and biology,” not because of histologic differences alone, the researchers wrote. The report was published in Clinical Gastroenterology and Hepatology.
Interval cancers are more common in the right colon, as several studies have noted. However, it was unclear whether this phenomenon represented a higher miss rate, a lower rate of successful polypectomy, or an increased risk of malignant histology in the proximal colon, the researchers wrote. Accordingly, they analyzed data on 594,614 index adenomas detected during more than 2.5 million screening colonoscopies performed between 2007 and 2012 and entered into the German National Screening Colonoscopy Registry.
A total of 3.5% of index adenomas showed high-grade dysplasia, which correlated most strongly with larger size, the researchers said. In fact, the odds of high-grade dysplasia were 10-fold higher when index adenomas measured at least 1 cm than when they were smaller. High-grade dysplasia also was significantly more frequent among patients older than 64 years, among men, and in pedunculated versus flat lesions. Given the large size of the dataset, all these associations were statistically significant.
Sessile lesions were slightly more likely to be high-grade compared with flat lesions, the investigators noted. Many proximal interval cancers arise from sessile serrated polyps, which may be subtle and difficult to detect or to resect completely, they continued. At the same time, colonoscopy also might be more likely to miss flat, serrated lesions when they are located proximally, and these lesions can become more aggressive over time. Thus, “[e]ndoscopist factors, such as missed lesions or incompletely removed lesions, may account for the predominance of proximal interval colorectal cancers.”
Like other registry studies, this study lacked uniform histopathologic definitions or central histopathology review. The dataset also covered only the largest or most histologically remarkable adenoma for each patient. However, the study findings did not change substantially after the researchers controlled for patients with missing location data, which presumably included patients with multiple polyps in both proximal and distal locations.
The researchers did not disclose external funding sources. They reported having no conflicts of interest.
SOURCE: Rösch T et al. Clin Gastroenterol Hepatol. 2018 Jun 11. doi: 10.1016/j.cgh.2018.05.043.
Colorectal cancers detected in a short interval after a complete and clearing colonoscopy are referred to as postcolonoscopy colon cancers or interval cancers, and are approximately three times more likely to occur in the proximal colon compared with the distal colon. Reasons for this difference are not known and possible explanations include alternate and accelerated tumor biology and rapid cancer progression, such as through the CpG island methylation phenotype pathway, missed cancers or precursor lesions in the proximal colon, or incomplete polyp resection. In the current study, the authors address whether the biology of polyps removed in the proximal colon is different, i.e., are these adenomas more likely to exhibit high-grade dysplasia compared to adenomas in the distal colon in approximately 2.5 million screening colonoscopies performed between 2007 and 2012, obtained from a screening colonoscopy registry in Germany. The authors did not find a difference in frequency of high-grade dysplasia between proximal and distal polyps. As expected, adenoma size, male sex, and older age were associated with finding of high-grade dysplasia, but contrary to current literature, the authors found that distal location and pedunculated (versus sessile) form were associated with high-grade dysplasia. A major limitation of the study is that sessile serrated polyps were not included, and the authors did not have information on villous histology. The study reinforces the hypothesis that missed and incompletely resected adenomas play a bigger role in missed proximal cancers, and that the goal of high-quality colonoscopy should be to detect and completely resect adenomas with equal vigilance in both the proximal and distal colon.
Aasma Shaukat, MD, MPH, AGAF, is professor of medicine in the division of gastroenterology and hepatology at the University of Minnesota, Minneapolis, and the GI Section Chief at the Minneapolis VA Medical Center. She has no conflicts of interest.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: Proximal adenoma location did not predict high-grade dysplasia.
Major finding: The odds of high-grade dysplasia were about 25% lower for proximal versus distal adenomas (odds ratio, 0.75).
Study details: Registry study of 594,614 adenomas identified during more than 2.5 million screening colonoscopies between 2007 and 2012.
Disclosures: The researchers did not disclose external funding sources. They reported having no conflicts of interest.
Source: Rösch T et al. Clin Gastroenterol Hepatol. 2018 Jun 11. doi: 10.1016/j.cgh.2018.05.043.
Antibiotics trigger proteolytic activity that leads to chronic colitis
Antibiotics are associated with increased large intestinal proteolytic activity and gut barrier disruption, thereby raising the risk of chronic colitis in susceptible individuals, a recent study found.
Although the association between antibiotics and chronic colitis has been previously described, this is the first study to demonstrate the causative role of high proteolytic activity, reported lead author Hongsup Yoon, PhD, of the chair of nutrition and immunology at Technische Universität München in Freising-Weihenstephan, Germany, and colleagues. The team’s experiments support the development of antiproteolytic strategies in susceptible humans.
“In the context of IBD, several clinical studies have already revealed that early and frequent antibiotic therapies, especially metronidazole or fluoroquinolone treatments, are associated with increased risk for Crohn’s disease,” the authors wrote in Cellular and Molecular Gastroenterology and Hepatology. “However, the causal role of antibiotic therapies in the disease development and the mechanisms underlying this [potentially] serious long-term adverse effect of antibiotics on the intestinal immune homeostasis remain unknown.”
Previous studies have shown that antibiotic therapy often causes high luminal proteolytic activity in the large intestine, likely because of the elimination of antiproteolytic bacteria that normally control pancreatic protease levels. Other studies have shown that exposing murine colonic mucosa to fecal supernatants with high proteolytic activity increases gut barrier permeability, which triggers chronic inflammation via translocation of luminal antigens.
“In view of these data,” the authors wrote, “we hypothesized that the antibiotic-increased proteolytic activity in the large intestine is a relevant risk factor for the development of colitis in susceptible organisms.”
The first component of the study used transwell experiments to evaluate the impact of high proteolytic activity on gut barrier integrity. High proteolytic activity was induced by several antibiotics, including fluoroquinolones with or without an imidazole (ciprofloxacin and levofloxacin with or without metronidazole), a beta-lactam combination (amoxicillin-clavulanate), a cephalosporin with or without a macrolide (ceftriaxone with or without azithromycin), and a rifamycin (rifaximin).
“All tested antibiotic classes mediated a major proteolytic activity increase in some patients but not in others,” the authors wrote, “demonstrating individual-specific vulnerability of the intestinal microbiota toward antibiotic therapies, which is likely caused by the high interindividual variability of human microbial ecosystems.”
One-quarter of patients had a 400% or greater increase in large intestinal proteolytic activity following antibiotic therapy, and several had an increase greater than 900%. Analysis indicated that proteolytic activity was caused by pancreatic proteases such as chymotrypsin and trypsin.
Subsequent cell line testing showed that stool supernatants with high proteolytic activity damaged the epithelial barrier, but samples with low proteolytic activity did not. Of note, the negative impact of high proteolytic activity on epithelial cells could be mitigated by incubating stool supernatants with a serine protease inhibitor.
In analogous experiments, mice were given a combination of vancomycin and metronidazole (V/M). In contrast with the various proteolytic activity levels observed in humans, all mice had high proteolytic activity levels following treatment, suggesting that V/M eliminated almost all antiproteolytic bacteria.
The loss of antiproteolytic bacteria was clarified by cecal microbiota transplantation tests. Transplants from untreated mice were capable of normalizing proteolytic activity levels in germ-free mice (which have high proteolytic activity levels), but transplants from V/M-treated mice were ineffective, suggesting a near-total loss of antiproteolytic bacteria. The identity of these antiproteolytic bacteria remains a mystery.
“Although our data are in line with published literature suggesting specific strains of the order Bacteroidales to play a role in the physiological inactivation of pancreatic proteases,” the authors wrote, “the identity of relevant antiproteolytic species/strains remains to be elucidated.”
The next part of the study involved wild-type and interleukin (IL)-10–/– mice, the latter of which serves as a model of human colitis. Both types of mice were given V/M with or without an oral serine protease inhibitor, a potential therapy intended to limit proteolytic activity and associated intestinal barrier damage.
Although both wild-type and IL-10–/– mice had increased intestinal permeability after V/M treatment, only IL-10–/– mice showed lasting inflammation. Of note, coadministration of an oral serine protease inhibitor with V/M protected against colitis in IL-10–/– mice.
The protective benefit of an oral serine protease inhibitor in IL-10–/– mice supports the development of antiproteolytic strategies in humans. These would target “large intestinal proteolytic activity [e.g., oral administration of encapsulated serine protease inhibitors, commensal antiproteolytic bacteria, or genetically modified bacteria expressing protease inhibitors] to protect the large intestinal mucosa from adverse effects of antibiotic-induced or diarrhea-induced high proteolytic activity,” the authors wrote.
The study was funded by the Deutscher Akademischer Austauschdienst. No conflicts of interest were reported.
SOURCE: Yoon H-S et al. Cell Mol Gastroenterol Hepatol. 2018 May 29. doi: 10.1016/j.jcmgh.2018.05.008.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: In patients susceptible to inflammatory bowel disease, antibiotics cause increased proteolytic activity in the large intestine that disrupts the gut barrier, thereby increasing risk of chronic colitis.
Major finding: One-quarter of patients had a 400% or greater increase in large intestinal proteolytic activity following antibiotic therapy.
Study details: A prospective study involving mice and humans treated with antibiotics.
Disclosures: The study was funded by the Deutscher Akademischer Austauschdienst. No conflicts of interest were reported.
Source: Yoon H et al. Cell Mol Gastroenterol Hepatol. 2018 May 29. doi: 10.1016/j.jcmgh.2018.05.008.
Guideline: Early screening warranted if family history of nonhereditary colorectal cancer
New consensus guidelines strongly recommend screening colonoscopy for individuals who have at least one first-degree relative with nonhereditary colorectal cancer or advanced adenoma.
Published in the November issue of Gastroenterology, the guideline cites moderate-quality evidence for this recommendation and reserves fecal immunochemical testing for individuals who refuse colonoscopy, are at increased risk for complications, or face barriers accessing the procedure.
Most colorectal cancer screening guidelines have focused on average-risk individuals or those at highest risk because of heritable germline mutations. However, hereditary syndromes account for only about 5% of colorectal cancers, noted Desmond Leddin, MB, MSc, FRCPC, FRCPI, of the University of Limerick, Ireland, and David A. Lieberman, MD, AGAF, FACG, of Oregon Health and Science University, Portland, with their associates from the Canadian Association of Gastroenterology Banff Consensus.
To develop the guideline, they searched the literature for studies of family history and colorectal cancer risk apart from hereditary Lynch syndrome, familial adenomatous polyposis, attenuated familial adenomatous polyposis, MUTYH-associated polyposis, Peutz-Jeghers syndrome, juvenile polyposis syndrome, Cowden syndrome, serrated (hyperplastic) polyposis syndrome, hereditary pancreatic cancer, and hereditary gastric cancer.
The resulting guideline cites two new systematic reviews and meta-analyses of 16 prospective studies, as well as one twin study, four retrospective cohort studies, one new systematic review of retrospective studies, and three prior systematic reviews and meta-analyses. The authors note that this is the first guideline to use the GRADE (Grading of Recommendations Assessment, Development and Evaluation) approach to make screening recommendations for individuals who have a family history of nonhereditary colorectal cancer or advanced adenoma.
For those with one first-degree relative with colorectal cancer, the guideline recommends screening colonoscopy or fecal immunochemical testing beginning at age 40-50 years, or 10 years before the age of diagnosis of the first-degree relative, whichever is earlier. The authors recommend spacing subsequent screening colonoscopies by 5-10 years and spacing fecal immunochemical testing by 1-2 years. They offer the same recommendation for individuals with one or more first-degree relatives with confirmed advanced adenoma.
For individuals whose family history includes at least two first-degree relatives with colorectal cancer, the guideline recommends an initial screening colonoscopy at age 40, or 10 years before the age at diagnosis of the earliest-diagnosed first-degree relative, whichever is earlier. Subsequent screening colonoscopies should occur every 5 years.
For persons with at least one second-degree relative with colorectal cancer, the guideline authors strongly recommend screening starting at age 50 with tests and intervals based on guidelines for average-risk individuals. Their recommendation is the same for individuals with at least one first-degree relative with nonadvanced adenoma or a polyp of unknown histology.
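The start-age arithmetic in the first-degree-relative recommendations above can be illustrated with a short, hypothetical helper. This is a simplification for readers rather than guideline text: the guideline gives a 40-50 year starting range for one affected first-degree relative (the earlier bound is used here), fixes age 40 for two or more, and leaves test choice and screening interval to clinical judgment.

```python
def screening_start_age(num_affected_fdr: int, youngest_dx_age: int) -> int:
    """Illustrative start age for screening, per the recommendations summarized above."""
    if num_affected_fdr <= 0:
        return 50  # average-risk guidance applies, as for second-degree relatives only
    baseline = 40  # lower bound of the 40-50 range for one relative; age 40 for two or more
    return min(baseline, youngest_dx_age - 10)  # whichever is earlier


print(screening_start_age(1, youngest_dx_age=45))  # 35: ten years before diagnosis at age 45
print(screening_start_age(2, youngest_dx_age=60))  # 40
```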
Given the low-quality evidence supporting most of these recommendations, the guideline calls for well-designed observational studies to better quantify the risk of colorectal cancer among individuals with a family history of nonheritable disease. Studies should especially focus on the optimal age of first screening and appropriate screening intervals, the guideline authors wrote. They also call for randomized controlled trials to assess whether colonoscopy, fecal immunochemical testing, or fecal occult blood screening might significantly reduce long-term risk for colorectal cancer and improve survival in this population.
Merck provided unrestricted funding for the work. Dr. Leddin reported having no conflicts of interest. Dr. Lieberman and several coauthors disclosed financial relationships with companies other than Merck. One coauthor disclosed advisory and consulting relationships with Merck.
SOURCE: Leddin D et al. Gastroenterology. 2018 Aug 16. doi: 10.1053/j.gastro.2018.08.017.
New consensus guidelines strongly recommend screening colonoscopy for individuals who have at least one first-degree relative with nonhereditary colorectal cancer or advanced adenoma.
Published in the November issue of Gastroenterology, the guideline cites moderate-quality evidence for this recommendation and reserves fecal immunochemical testing for individuals who refuse colonoscopy, are at increased risk for complications, or face barriers accessing the procedure.
Most colorectal cancer screening guidelines have focused on average-risk individuals or those at highest risk because of heritable germline mutations. However, hereditary syndromes comprise only about 5% of colorectal cancers, noted Desmond Leddin, MB, MSc, FRCPC, FRCPI, of the University of Limerick, Ireland, and David A. Lieberman, MD, AGAF, FACG, of Oregon Health and Science University, Portland, with their associates from the Canadian Association of Gastroenterology Banff Consensus.
To develop the guideline, they searched the literature for studies of family history and colorectal cancer risk apart from hereditary Lynch syndrome, familial adenomatous polyposis, attenuated familial adenomatous polyposis, MUTYH-associated polyposis, Peutz-Jeghers syndrome, juvenile polyposis syndrome, Cowden syndrome, serrated (hyperplastic) polyposis syndrome, hereditary pancreatic cancer, and hereditary gastric cancer.
The ensuant guideline cites two new systematic reviews and meta-analyses of 16 prospective studies, as well as one twin study, four retrospective cohort studies, one new systematic review of retrospective studies, and three prior systematic reviews and meta-analyses. The authors note that this is the first guideline to use the GRADE (Grading of Recommendation Assessment, Development and Evaluation) approach to make screening recommendations for individuals who have a family history of nonhereditary colorectal cancer or advanced adenoma.
For those with one first-degree relative with colorectal cancer, the guideline recommends screening colonoscopy or fecal immunochemical testing beginning at age 40-50 years, or 10 years before the age of diagnosis of the first-degree relative, whichever is earlier. The authors recommend spacing subsequent screening colonoscopies by 5-10 years and spacing fecal immunochemical testing by 1-2 years. They offer the same recommendation for individuals with one or more first-degree relatives with confirmed advanced adenoma.
For individuals whose family history includes at least two first-degree relatives with colorectal cancer, the guideline recommends an initial screening colonoscopy at age 40, or 10 years earlier than the age of earliest-diagnosed first-degree relative, whichever is earlier. Screenings should occur every 5 years.
For persons with at least one second-degree relative with colorectal cancer, the guideline authors strongly recommend screening starting at age 50 with tests and intervals based on guidelines for average-risk individuals. Their recommendation is the same for individuals with at least one first-degree relative with nonadvanced adenoma or a polyp of unknown histology.
Given the low-quality evidence supporting most of these recommendations, the guideline calls for well designed observational studies to better quantify the risk of colorectal cancer among individuals with a family history of nonheritable disease. Studies should especially focus on the optimal age of first screening and appropriate screening intervals, the guideline authors wrote. Also, they call for randomized controlled trials to assess whether colonoscopy, fecal immunochemical testing, or fecal occult blood screening might significantly reduce long-term risk for colorectal cancer and improve survival in this population.
Merck provided unrestricted funding for the work. Dr. Leddin reported having no conflicts of interest. Dr. Lieberman and several coauthors disclosed financial relationships with companies other than Merck. One coauthor disclosed advisory and consulting relationships with Merck.
SOURCE: Leddin D et al. Gastroenterology. 2018 Aug 16. doi: 10.1053/j.gastro.2018.08.017.
New consensus guidelines strongly recommend screening colonoscopy for individuals who have at least one first-degree relative with nonhereditary colorectal cancer or advanced adenoma.
Published in the November issue of Gastroenterology, the guideline cites moderate-quality evidence for this recommendation and reserves fecal immunochemical testing for individuals who refuse colonoscopy, are at increased risk for complications, or face barriers accessing the procedure.
Most colorectal cancer screening guidelines have focused on average-risk individuals or those at highest risk because of heritable germline mutations. However, hereditary syndromes comprise only about 5% of colorectal cancers, noted Desmond Leddin, MB, MSc, FRCPC, FRCPI, of the University of Limerick, Ireland, and David A. Lieberman, MD, AGAF, FACG, of Oregon Health and Science University, Portland, with their associates from the Canadian Association of Gastroenterology Banff Consensus.
To develop the guideline, they searched the literature for studies of family history and colorectal cancer risk apart from hereditary Lynch syndrome, familial adenomatous polyposis, attenuated familial adenomatous polyposis, MUTYH-associated polyposis, Peutz-Jeghers syndrome, juvenile polyposis syndrome, Cowden syndrome, serrated (hyperplastic) polyposis syndrome, hereditary pancreatic cancer, and hereditary gastric cancer.
The ensuant guideline cites two new systematic reviews and meta-analyses of 16 prospective studies, as well as one twin study, four retrospective cohort studies, one new systematic review of retrospective studies, and three prior systematic reviews and meta-analyses. The authors note that this is the first guideline to use the GRADE (Grading of Recommendation Assessment, Development and Evaluation) approach to make screening recommendations for individuals who have a family history of nonhereditary colorectal cancer or advanced adenoma.
For those with one first-degree relative with colorectal cancer, the guideline recommends screening colonoscopy or fecal immunochemical testing beginning at age 40-50 years, or 10 years before the age of diagnosis of the first-degree relative, whichever is earlier. The authors recommend spacing subsequent screening colonoscopies by 5-10 years and spacing fecal immunochemical testing by 1-2 years. They offer the same recommendation for individuals with one or more first-degree relatives with confirmed advanced adenoma.
For individuals whose family history includes at least two first-degree relatives with colorectal cancer, the guideline recommends an initial screening colonoscopy at age 40, or 10 years earlier than the age of earliest-diagnosed first-degree relative, whichever is earlier. Screenings should occur every 5 years.
For persons with at least one second-degree relative with colorectal cancer, the guideline authors strongly recommend screening starting at age 50 with tests and intervals based on guidelines for average-risk individuals. Their recommendation is the same for individuals with at least one first-degree relative with nonadvanced adenoma or a polyp of unknown histology.
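Because these recommendations reduce to a small set of decision rules (which relative is affected, how many, and the youngest age at diagnosis), the logic can be written out compactly. The following Python sketch is purely illustrative and is not clinical software: the function and parameter names are assumptions made for this example, and where the guideline gives a 40-50 year range the sketch simply uses the upper bound.

```python
def first_screening_plan(relation, n_affected=1, youngest_dx_age=None):
    """Illustrative sketch of the screening-age and interval rules summarized above.

    relation: 'fdr_crc', 'fdr_advanced_adenoma', 'sdr_crc',
              or 'fdr_nonadvanced_adenoma_or_unknown_polyp'
    n_affected: number of affected first-degree relatives (relevant for CRC)
    youngest_dx_age: age at diagnosis of the earliest-diagnosed relative, if known
    Returns (start_age, test, interval).
    """
    if relation == 'fdr_crc' and n_affected >= 2:
        # Two or more first-degree relatives with CRC: colonoscopy at 40,
        # or 10 years before the earliest diagnosis, whichever is earlier.
        start = 40
        if youngest_dx_age is not None:
            start = min(start, youngest_dx_age - 10)
        return start, 'colonoscopy', 'every 5 years'
    if relation in ('fdr_crc', 'fdr_advanced_adenoma'):
        # One first-degree relative with CRC or advanced adenoma.
        start = 50  # guideline range is 40-50 years; upper bound used here
        if youngest_dx_age is not None:
            start = min(start, youngest_dx_age - 10)
        return (start,
                'colonoscopy (FIT if colonoscopy is refused or not feasible)',
                'colonoscopy every 5-10 years or FIT every 1-2 years')
    # Second-degree relative with CRC, or first-degree relative with a
    # nonadvanced adenoma or polyp of unknown histology.
    return 50, 'average-risk screening tests', 'average-risk intervals'


# Example: one first-degree relative diagnosed with CRC at age 45
print(first_screening_plan('fdr_crc', n_affected=1, youngest_dx_age=45))
```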
Given the low-quality evidence supporting most of these recommendations, the guideline calls for well designed observational studies to better quantify the risk of colorectal cancer among individuals with a family history of nonheritable disease. Studies should especially focus on the optimal age of first screening and appropriate screening intervals, the guideline authors wrote. Also, they call for randomized controlled trials to assess whether colonoscopy, fecal immunochemical testing, or fecal occult blood screening might significantly reduce long-term risk for colorectal cancer and improve survival in this population.
Merck provided unrestricted funding for the work. Dr. Leddin reported having no conflicts of interest. Dr. Lieberman and several coauthors disclosed financial relationships with companies other than Merck. One coauthor disclosed advisory and consulting relationships with Merck.
SOURCE: Leddin D et al. Gastroenterology. 2018 Aug 16. doi: 10.1053/j.gastro.2018.08.017.
FROM GASTROENTEROLOGY
AGA Clinical Practice Update: Diagnosis of rumination syndrome
Promote diaphragmatic breathing to help patients manage rumination syndrome, advised the authors of an expert review of clinical practice updates published in Clinical Gastroenterology and Hepatology. “Patients, not unsurprisingly, typically use the word ‘vomiting’ to describe rumination events, and many patients are misdiagnosed as having refractory vomiting, gastroesophageal reflux disease, or gastroparesis,” Magnus Halland, MD, of the Mayo Clinic in Rochester, Minn., and colleagues wrote in the review. “A long delay in receiving a diagnosis is common and can lead to unnecessary testing, reduced quality of life, and even invasive procedures such as surgery or feeding tubes.”
Rumination syndrome differs from vomiting, the authors noted, because the retrograde flow of ingested gastric content does not have an acidic taste and may in fact taste like recently ingested food or drink. Rumination can occur without any preceding event, after a reflux episode, or after swallowing air that causes gastric straining, but it typically happens within 1-2 hours after a meal. Patients with rumination syndrome can experience weight loss, dental erosions and caries, heartburn, nausea, bloating, diarrhea, abdominal pain and discomfort, and belching, among other symptoms, the authors said.
Dr. Halland and his colleagues provided seven best practice recommendations for rumination syndrome in their updates, which include:
- Patients who show symptoms of consistent postprandial regurgitation, often misdiagnosed with refractory gastroesophageal reflux or vomiting, should be considered for rumination syndrome.
- Patients who have dysphagia, nausea, nocturnal regurgitation, or gastric symptoms outside of meals are less likely to have rumination syndrome, but those symptoms do not exclude the condition.
- The Rome IV criteria are advised for diagnosing rumination syndrome after medical work-up: “persistent or recurrent regurgitation of recently ingested food into the mouth with subsequent spitting or remastication and swallowing” that is not preceded by retching, with the criteria fulfilled for the last 3 months and symptom onset at least 6 months before diagnosis.
- Patients should receive first-line therapy for rumination syndrome consisting of diaphragmatic breathing with or without biofeedback.
- Patients should be referred to a speech therapist, gastroenterologist, psychologist, or other knowledgeable health practitioners to learn effective diaphragmatic breathing.
- Current limitations in the diagnosis of rumination syndrome include need for expertise and lack of standardized protocols, but “testing for rumination syndrome with postprandial high-resolution esophageal impedance manometry can be used to support the diagnosis.”
- Baclofen (10 mg) taken three times daily is a “reasonable next step” for patients who do not respond to treatment.
The authors acknowledged that many questions remain unanswered, including the pathophysiology and initiating factors of rumination syndrome. They noted that future studies are needed to address epidemiology, develop validated tools for measuring symptoms, examine the effect of diaphragmatic breathing on reducing symptoms, and assess the condition’s impact on quality of life.
“Indeed, the basic question of how subconsciously one can learn to regurgitate still needs to be answered,” Dr. Halland and his colleagues wrote.
The authors report no relevant conflicts of interest.
SOURCE: Halland M et al. Clin Gastroenterol Hepatol. 2018 Jun 11. doi: 10.1016/j.cgh.2018.05.049.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Commentary: Composite risk, not age, is key for timing first colorectal cancer screening
The American Cancer Society’s recent recommendation to lower the age of first screening for colorectal cancer to 45 years does not reflect clear knowledge of risks versus benefits, experts wrote in a recent commentary.
“In the big picture, [the question of whether to start screening at 45 versus 50 years] seems relatively unimportant compared with using individual patient risk for advanced neoplasia in practical, feasible models” that are geared toward adherence, efficiency, and cost-efficacy, wrote Thomas F. Imperiale, MD, of Indiana University, Indianapolis, and his associates. The commentary is in the October issue of Clinical Gastroenterology and Hepatology.
Tailoring the age of first screening on an individual level, based on other risk factors and patient preferences, might improve uptake and the balance of benefits and risks, they argued.
Rates of colorectal cancer in persons under age 50 years rose by about 22% between 2000 and 2013. However, estimates for the most recent birth cohorts have wide confidence intervals, “indicating imprecision and uncertainty that this trend will continue,” the experts wrote. Furthermore, the absolute risk of colorectal cancer among individuals younger than 50 years has risen only slightly, from 5.9 cases per 100,000 population to 7.2 cases per 100,000 population. “[This] small increase in incidence may represent a true increase or could be due to increased use of colonoscopy in general, and specifically, for diagnosis or high-risk screening of first-degree relatives of persons with colorectal cancer,” the experts wrote.
Implementing the new recommendation could detect earlier-stage (curable) colorectal cancer “in a youthful and productive age group that may be sandwiched between raising children and caring for aging parents,” they continued. Earlier detection could reduce mortality and lower the costs of treating a disease whose care often exceeds $100,000 per person annually.
However, the recommendation was based on a modeling study that assumed 100% adherence. In reality, uptake among 45- to 49-year-olds might be 15%-20%, and “who actually shows up for screening could make or break this recommendation,” the experts said. If younger individuals who underwent screening tended to have few risk factors for colorectal cancer, then the new recommendation would lead to many false positives and unnecessary colonoscopies, with the associated fallout of emotional harm and wasted health care resources, they added.
Population-level studies have identified age as the strongest predictor of colorectal cancer, but age “does not perform as well” at patient level, the experts said. They emphasized the role of other risk factors, such as male sex, having a first-degree relative with colorectal cancer, high body mass index, metabolic syndrome, cigarette smoking, diet, adherence to screening, and use of aspirin, nonsteroidal anti-inflammatory drugs, and hormone therapy. “The goal for providers and health systems is to determine whether and how to change screening practice and policy, and how to incorporate this new recommendation into practice, a necessarily complex process that requires knowing patient risk, patient preferences, and the long-term balance of benefits and burdens,” they concluded (Clin Gastroenterol Hepatol. 2018 Aug 13. doi: 10.1016/j.cgh.2018.08.023).
Dr. Imperiale and coauthor Charles J. Kahi, MD, MS, had no disclosures. Coauthor Douglas K. Rex, MD, disclosed ties to Aries Pharmaceutical, Cosmo Pharmaceuticals, Boston Scientific, Sebela, Medtronic, EndoAid Ltd, Olympus, Paion, Braintree, and Medivators. He also chairs the U.S. Multi-Society Task Force on Colorectal Cancer.
* This story was updated on 10/16/2018.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Proinflammatory diet linked to colorectal cancer testing positive for Fusobacterium nucleatum
Diets promoting colonic inflammation were associated with a greater risk of colorectal carcinomas containing Fusobacterium nucleatum bacteria, according to a report in the October issue of Clinical Gastroenterology and Hepatology.
Proinflammatory diets were not linked to heightened risk for colon cancers without these bacteria, reported Li Liu, MD, PhD, of Dana-Farber Cancer Institute and Harvard Medical School, Boston. “These findings indicate that diet-induced intestinal inflammation alters the gut microbiome to contribute to colorectal carcinogenesis,” they wrote. “Nutritional interventions might be used in precision medicine and cancer prevention.”
Intestinal inflammation, a risk factor for colorectal cancer, is associated with high levels of circulating interleukin 6, C-reactive protein, and tumor necrosis factor–receptor superfamily member 1B. Colonic inflammation impairs the mucosal barrier and alters immune cell responses, which affects the composition of colonic microbiota. Among these, F. nucleatum is known to potentiate colorectal tumors and is associated with proximal tumor location, other tumor features, and cancer progression and chemoresistance.
For the study, the investigators examined self-reported data from more than 124,000 individuals followed for 28 years as part of the Nurses’ Health Study and the Health Professionals Follow-Up Study. They calculated average dietary patterns based on the empiric dietary inflammatory pattern (EDIP) score, which sums weighted intake scores for 18 foods (such as red and processed meat, coffee, tea, and leafy green or dark yellow vegetables) that are known to affect plasma levels of interleukin 6, C-reactive protein, tumor necrosis factor–receptor superfamily member 1B, and tumor necrosis factor alpha–receptor 2. A higher EDIP score denotes a more inflammatory diet.
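As a rough illustration of how a weighted dietary index of this kind is computed, the sketch below sums weighted intakes across food groups, with positive weights for proinflammatory items and negative weights for anti-inflammatory ones. The food groups and weights shown are placeholders for this example only; the actual 18 components and their empirically derived weights are defined in the published EDIP literature and are not reproduced here.

```python
# Placeholder weights for illustration; real EDIP weights come from regression
# of food-group intakes on plasma inflammatory markers.
EXAMPLE_WEIGHTS = {
    "processed_meat": 0.5,           # proinflammatory (placeholder)
    "red_meat": 0.4,                 # proinflammatory (placeholder)
    "leafy_green_vegetables": -0.6,  # anti-inflammatory (placeholder)
    "coffee": -0.3,                  # anti-inflammatory (placeholder)
    "tea": -0.2,                     # anti-inflammatory (placeholder)
}

def edip_style_score(servings_per_day: dict) -> float:
    """Weighted sum of reported intakes; as with the real EDIP score,
    higher values denote a more inflammatory dietary pattern."""
    return sum(EXAMPLE_WEIGHTS.get(food, 0.0) * amount
               for food, amount in servings_per_day.items())

# A meat-heavy diet scores higher (more inflammatory) than one rich in
# leafy greens, coffee, and tea.
print(edip_style_score({"red_meat": 2, "processed_meat": 1}))
print(edip_style_score({"leafy_green_vegetables": 3, "coffee": 2, "tea": 1}))
```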
During the 28-year follow-up period, 951 individuals developed colorectal carcinomas that were tested with a polymerase chain reaction assay for F. nucleatum DNA. A total of 115 tumors tested positive for F. nucleatum. After the researchers controlled for potential confounders, individuals whose EDIP scores were in the highest tertile were significantly more likely to develop F. nucleatum–positive colorectal cancer than were those who scored in the lowest tertile (adjusted hazard ratio, 1.63; 95% confidence interval, 1.03 to 2.58; P = .03). This differential association “appeared to be stronger in proximal colon cancer than in distal colon and rectal cancer,” the researchers said.
More than 90% of individuals in this study were non-Hispanic white, the researchers noted. Tumor tissue was not available from all cases of colorectal cancer and a fairly small number of cases tested positive for tumor F. nucleatum. Nonetheless, the findings suggest that an inflammatory diet could help amplify gut microbiota involved in tumorigenesis, they said. Pending confirmatory studies, they recommended an anti-inflammatory diet with high intake of green leafy vegetables, dark yellow vegetables, coffee, and tea, and with low intake of red meat, processed meat, refined grain, and sugary beverages. They also recommended studying whether F. nucleatum tumor or stool tests could help personalize dietary interventions.
Funders included the National Institutes of Health, Dana Farber Harvard Cancer, Project P. Fund for Colorectal Cancer Research, Friends of the Dana-Farber Cancer Institute, Bennett Family Fund, the Entertainment Industry Foundation, and American Association for Cancer Research, National Natural Science Foundation of China, Chinese Scholarship Council, Huazhong University of Science and Technology, and others. Dr. Liu had no disclosures. One coinvestigator disclosed ties to Genentech/Roche, Lilly, Sanofi, Bayer, and several other biomedical companies.
SOURCE: Liu L et al. Clin Gastroenterol Hepatol. 2018 Apr 24. doi: 10.1016/j.cgh.2018.04.030.
The underlying reasons colorectal cancer (CRC) develops are unknown, but they likely include a complex interaction between genetics and environmental exposures. Recent studies have highlighted important links between diet, the intestinal microbiota, and CRC development and progression.
Liu et al. used the Nurses’ Health Study and Health Professionals Follow-Up Study cohorts to extend our understanding of the relationship between diet, the intestinal microbiota, and CRC. They utilized validated food frequency questionnaires obtained every 4 years and formalin-fixed, paraffin-embedded CRC tissue samples collected from 951 individuals. They calculated an empiric dietary inflammatory pattern (EDIP) score, which correlates components of the diet with plasma inflammatory markers. After adjusting for confounders, they found high EDIP scores were significantly associated with Fusobacterium nucleatum–positive CRC, but not with F. nucleatum–negative CRC. In addition, they demonstrated this association was stronger for proximal than for distal CRC. Their findings suggest an inflammatory diet may interact with the intestinal microbiota to promote the development of CRC, and they provide a preliminary recommendation to minimize intake of potentially harmful foods (e.g., red meat, processed meat, and refined grains). Despite the intriguing results, the authors recognize limitations, including the small number of cases with F. nucleatum present (n = 115) and the homogeneous cohort (90% non-Hispanic whites), which may limit generalizability.
As clinicians, we should continue strongly advocating for CRC screening and, based on these findings, may consider dietary recommendations to reduce intake of potentially harmful foods. Further research will be needed to confirm these findings in additional cohorts and to clarify the molecular interactions between dietary components, intestinal microbiota, and development of CRC.
Rajesh R. Shah, MD, is assistant professor of gastroenterology, department of internal medicine, Baylor College of Medicine, Houston. He has no conflicts of interest.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: A proinflammatory diet was associated with a significantly increased risk for colorectal cancer testing positive for Fusobacterium nucleatum.
Major finding: Dietary scores in the highest inflammatory tertile correlated with significantly increased risk (HR, 1.63; P = .03).
Study details: Longitudinal study of self-reported dietary patterns and cancers among 124,433 individuals with 28 years of follow-up.
Disclosures: Funders included the National Institutes of Health, Dana Farber Harvard Cancer, Project P. Fund for Colorectal Cancer Research, Friends of the Dana-Farber Cancer Institute, Bennett Family Fund, the Entertainment Industry Foundation, and American Association for Cancer Research, National Natural Science Foundation of China, Chinese Scholarship Council, Huazhong University of Science and Technology, and others. Dr. Liu had no disclosures. One coinvestigator disclosed ties to Genentech/Roche, Lilly, Sanofi, Bayer, and several other biomedical companies.
Source: Liu L et al. Clin Gastroenterol Hepatol. 2018 Apr 24. doi: 10.1016/j.cgh.2018.04.030.
Cirrhosis study finds no link between screening, liver cancer mortality
In a case-control study of patients with cirrhosis, screening for hepatocellular carcinoma up to 4 years prior to diagnosis was not associated with lower mortality.
Similar proportions of cases and controls underwent screening with abdominal ultrasonography, serum alpha-fetoprotein (AFP) testing, or both, reported Andrew M. Moon, MD, MPH, of the University of North Carolina at Chapel Hill, and his associates. “There was also no difference in receipt of these screening tests within 1, 2, or 3 years prior to the index date,” they wrote. The report was published in Gastroenterology. The findings “[suggest] that either these screening tests or the currently available treatments [for liver cancer], or both, are suboptimal and need to be improved.”
Because cirrhosis significantly increases the risk of hepatocellular carcinoma, the American Association for the Study of Liver Diseases, the European Association for the Study of the Liver, and the Asian Pacific Association for the Study of the Liver recommend screening cirrhotic patients every 6 months with abdominal ultrasonography with or without concomitant serum AFP. But nonliver societies have not endorsed this approach, citing a lack of high-quality data. One problem is that studies have compared patients whose liver cancer was diagnosed by screening with those diagnosed after they became symptomatic, which creates a lead-time bias that inherently favors screening, Dr. Moon and his associates noted.
To help fill the evidence gap, they identified 238 patients from the Veterans Affairs health care system who had died of hepatocellular carcinoma between 2013 and 2015 and who had been diagnosed with cirrhosis at least 4 years beforehand. They compared these cases with an equal number of patients with cirrhosis who had been in VA care for a similar amount of time and had not died of hepatocellular carcinoma. Cases and controls were matched by etiology of cirrhosis, year that cirrhosis was diagnosed, race, age, sex, Model for End-Stage Liver Disease score, and VA medical center. The researchers identified screening tests by reviewing blinded medical charts.
There were no significant differences in the proportions of cases and controls who underwent screening ultrasonography (52.9% versus 54.2%, respectively), screening serum AFP (74.8% versus 73.5%), either test (81.1% versus 79.4%), or both tests (46.6% versus 48.3%) within the 4 years preceding the index date for cases or the corresponding date for matched controls. Results were similar after potential confounders were controlled for and when shorter time frames of 1, 2, and 3 years were examined.
It was unlikely that these results reflect delayed diagnosis of liver cancer or a lack of treatment within the VA system, the experts wrote. A total of 51.3% of cases were diagnosed within Milan criteria, which exceeds the proportion in the national Surveillance, Epidemiology, and End Results registry, they noted. None of the fatal cases underwent liver transplantation, but 66.8% received other treatments for liver cancer.
Funders included the National Institutes of Health and the Veterans Affairs Clinical Science Research & Development. The investigators reported having no conflicts of interest.
SOURCE: Moon AM et al. Gastroenterology. 2018 Jul 5. doi: 10.1053/j.gastro.2018.06.079.
FROM GASTROENTEROLOGY
Key clinical point: Among patients with cirrhosis, screening for hepatocellular carcinoma was not associated with reductions in liver cancer mortality.
Major finding: Similar proportions of cases and controls were screened by abdominal ultrasonography, serum alpha-fetoprotein, or both up to 4 years before the index date and even after researchers controlled for relevant confounders.
Study details: A matched case-control study of 476 patients from the Veterans Affairs health care system.
Disclosures: Funders included the National Institutes of Health and the Veterans Affairs Clinical Science Research & Development. The investigators reported no conflicts of interest.
Source: Moon AM et al. Gastroenterology. 2018 Jul 5. doi: 10.1053/j.gastro.2018.06.079.
Experts update diagnostic guidelines for eosinophilic esophagitis
The diagnosis of eosinophilic esophagitis no longer needs to include a trial of proton pump inhibitor (PPI) therapy, according to an updated international consensus statement published in the October issue of Gastroenterology.
“An initial rationale for the PPI trial was to distinguish eosinophilic esophagitis from gastroesophageal reflux disease, but it is now known that these conditions have a complex relationship and are not necessarily mutually exclusive,” wrote Evan S. Dellon, MD, of the University of North Carolina at Chapel Hill, and his associates. According to current evidence, “PPIs are better classified as a treatment for esophageal eosinophilia that may be due to eosinophilic esophagitis than as a diagnostic criterion,” they said.
Diagnostic guidelines for eosinophilic esophagitis were first published in 2007 and updated in 2011. The guideline authors recommended either pH monitoring or an 8-week trial of high-dose PPI therapy to rule out inflammation from gastroesophageal reflux disease (GERD).
But subsequent publications described patients with symptomatic esophageal eosinophilia who responded to PPIs and lacked classic GERD symptoms. Guidelines called this condition “PPI-responsive esophageal eosinophilia” and considered it a separate entity from GERD.
However, an “evolving body of research” shows that eosinophilic esophagitis can overlap with GERD, Dr. Dellon and his associates wrote. Furthermore, each of these conditions can trigger the other. Eosinophilic esophagitis can decrease esophageal compliance, leading to secondary reflux, while gastroesophageal reflux can erode the esophageal epithelium, triggering antigen exposure and eosinophilia.
Therefore, Dr. Dellon and his associates recommended defining eosinophilic esophagitis as signs and symptoms of esophageal dysfunction plus an esophageal biopsy showing at least 15 eosinophils per high-power field, or approximately 60 eosinophils per square millimeter, with infiltration limited to the esophagus. They stressed the importance of esophageal biopsy even if endoscopy shows normal mucosa. “As per prior guidelines, multiple biopsies from two or more esophageal levels, targeting areas of apparent inflammation, are recommended to increase the diagnostic yield,” they added. “Gastric and duodenal biopsies should be obtained as clinically indicated by symptoms, endoscopic findings in the stomach or duodenum, or high index of suspicion for a mucosal process.”
Physicians should increase their suspicion of eosinophilic esophagitis if patients have other types of atopy or endoscopic findings of “rings, furrows, exudates, edema, stricture, narrowing, and crepe-paper mucosa,” they added. In addition to GERD, they recommended looking carefully for other conditions that can trigger esophageal eosinophilia, such as pemphigus, drug hypersensitivity reactions, achalasia, and Crohn’s disease with esophageal involvement.
To create the guideline, Dr. Dellon and his associates searched PubMed for studies of all designs and sizes published from 1966 through December 2016. Teams of experts on specific topics then reviewed and discussed relevant literature. In May 2017, 43 reviewers met for 8 hours to present and discuss conclusions. There was 100% agreement to remove the PPI trial from the diagnostic criteria, the experts noted.
The authors disclosed financial support from the International Gastrointestinal Eosinophilic Diseases Researchers (TIGERS), The David and Denise Bunning Family, and the Rare Disease Clinical Research Network. Dr. Dellon disclosed consulting relationships and research funding from Adare, Celgene/Receptos, Regeneron, and Shire, among others. The majority of his coauthors also disclosed relationships with numerous medical companies.
Source: Dellon ES et al. Gastroenterology. 2018 Jul 12. doi: 10.1053/j.gastro.2018.07.009.
Studies in the 1980s linked the presence of esophageal mucosal eosinophils with increased acid exposure on pH monitoring. For the next 2 decades, clinicians viewed eosinophils on esophageal biopsies as diagnostic for GERD, such that the initial description of EoE by Attwood in 1993 distinguished EoE from GERD by the presence of esophageal eosinophilia in the absence of either reflux esophagitis or abnormal acid exposure on pH testing. Consequently, the initial diagnostic criteria for EoE in 2007 included a lack of response to PPI therapy and/or normal pH testing to establish the diagnosis. Reflecting growing uncertainty regarding the ability of PPI therapy to differentiate acid-induced from allergic inflammatory mechanisms, an updated consensus in 2011 introduced the term “PPI-responsive esophageal eosinophilia (PPI-REE)” to describe an increasingly recognized subset of patients with suspected EoE whose eosinophilia resolved with PPI therapy. Now, supported by scientific evidence accumulated over the past decade, the AGREE consensus has taken a step back by removing the PPI trial from the diagnosis of EoE, thereby abandoning the PPI-REE terminology. This step simplifies the diagnosis of EoE and acknowledges that a histologic response to PPIs does not “rule in” GERD or “rule out” EoE. It is important to emphasize that the updated criteria still advocate careful consideration of secondary causes of esophageal eosinophilia before the diagnosis of EoE is made.
Ramifications of the updated diagnostic criteria include the opportunity for clinicians to consider topical corticosteroids and diet therapies, rather than mandating an up-front PPI trial, in patients with EoE. On a practical level, based on their effectiveness, safety, and ease of administration, PPIs remain positioned as a favorable initial intervention for EoE. Conceptually, however, the paradigm shift highlights the ability of research to improve our understanding of disease pathogenesis and thereby impact clinical management.
Ikuo Hirano, MD, AGAF, is in the division of gastroenterology, Northwestern University, Chicago. He has received grant support from the NIH Consortium of Eosinophilic Gastrointestinal Disease Researchers (CEGIR, U54 AI117804), which is part of the Rare Disease Clinical Research Network. He has received research funding and consulting fees from Celgene, Regeneron, Shire, and others.
FROM GASTROENTEROLOGY
Key clinical point: The diagnosis of eosinophilic esophagitis no longer needs to include a trial of proton pump inhibitor therapy.
Major finding: Eosinophilic esophagitis and gastroesophageal reflux disease are not mutually exclusive.
Study details: Review by an international consensus panel of studies published between 1966 and 2016.
Disclosures: The authors disclosed financial support from the International Gastrointestinal Eosinophilic Diseases Researchers (TIGERS), The David and Denise Bunning Family, and the Rare Disease Clinical Research Network. Dr. Dellon disclosed consulting relationships with Adare, Allakos, Alivio, Banner, Celgene/Receptos, Enumeral, GSK, Regeneron, and Shire. He also reported receiving research funding from Adare, Celgene/Receptos, Miraca, Meritage, Nutricia, Regeneron, and Shire, and educational grants from Banner and Holoclara. The majority of his coauthors disclosed relationships with numerous medical companies.
Source: Dellon ES et al. Gastroenterology. 2018 Jul 12. doi: 10.1053/j.gastro.2018.07.009.
AGA Guideline: Treatment of opioid-induced constipation
For patients with suspected opioid-induced constipation, start by taking a careful history of defecation and dietary patterns, stool consistency, incomplete evacuation, and “alarm symptoms,” such as bloody stools or weight loss, according to a new guideline from the American Gastroenterological Association published in Gastroenterology.
Clinicians also should rule out other causes of constipation, such as pelvic outlet dysfunction, mechanical obstruction, metabolic abnormalities, and comorbidities or concurrent medications, wrote Seth D. Crockett, MD, MPH, of the University of North Carolina at Chapel Hill, together with his associates. The guideline was published online Sept. 1.
Opioid therapy can lead to a range of gastrointestinal symptoms, such as constipation, gastroesophageal reflux, nausea and vomiting, bloating, and abdominal pain. Among these, constipation is by far the most common and debilitating, the guideline notes. In past studies, 40%-80% of patients who received opioids developed opioid-induced constipation (OIC), a more severe presentation that combines reduced stool frequency with other symptoms, such as harder stools, new or worsening straining during defecation, and a sense of incomplete rectal evacuation.
Treating OIC should start with lifestyle interventions, such as drinking more fluids, toileting as soon as possible when feeling the urge to defecate, and adding regular moderate exercise whenever tolerable, the guideline advises. For patients on oral or parenteral therapy, consider switching to an equianalgesic dose of a less-constipating opioid, such as transdermal fentanyl or oxycodone-naloxone combination therapy.
Many patients with OIC require interventions beyond lifestyle changes or opioid switching. For these patients, the guideline advises starting with conventional laxative therapies based on their safety, low cost, and “established efficacy” in the OIC setting. Options include stool softeners (docusate sodium), osmotic laxatives (polyethylene glycol, magnesium hydroxide, magnesium citrate, and lactulose), lubricants (mineral oil), and stimulant laxatives (bisacodyl, sodium picosulfate, and senna). “Of note, there is little evidence that routine use of stimulant laxatives is harmful to the colon, despite widespread concern to the contrary,” the guideline states. Although randomized, controlled trials have not evaluated particular laxative combinations or titrations for OIC, the best evidence supports stimulant and osmotic laxative therapy, the authors note.
Before deeming any case of OIC laxative refractory, ensure that a patient receives an adequate trial of at least two classes of laxatives administered on a regular schedule, not just “as needed,” the guideline specifies. For example, a patient might receive a 2-week trial of a daily osmotic laxative plus a stimulant laxative two to three times weekly. The guideline authors suggest restricting the use of enemas to rescue therapy. They also note that consuming more fiber tends not to help patients with OIC because fiber does not affect colonic motility.
For truly laxative-refractory OIC, the guideline recommends escalating treatment to peripherally acting mu-opioid receptor antagonists (PAMORAs), which restore the function of the enteric nervous system by blocking mu-opioid receptors in the gut. Among the PAMORAs, the guideline strongly recommends naldemedine or naloxegol over no treatment, based on robust data from randomized, double-blind, placebo-controlled trials. In the phase 3 COMPOSE 1 and 2 trials, about 52% of patients who received naldemedine achieved at least three spontaneous bowel movements per week, compared with 35% of patients who received placebo. Additionally, in a 52-week safety and efficacy study (COMPOSE 3), naldemedine was associated with one more spontaneous bowel movement per week versus placebo and with a low absolute increase in adverse events.
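For a rough sense of the size of that effect, the response rates quoted above imply the absolute difference and number needed to treat sketched below; these derived figures are illustrative only and are not reported in the guideline.

    # Illustrative arithmetic only, based on the responder rates quoted above;
    # the derived figures below are not reported in the guideline itself.
    naldemedine_response = 0.52   # ~52% achieved >= 3 spontaneous bowel movements/week
    placebo_response = 0.35       # ~35% of placebo recipients did

    absolute_difference = naldemedine_response - placebo_response
    number_needed_to_treat = 1 / absolute_difference

    print(f"Absolute difference: {absolute_difference:.0%}")                    # about 17%
    print(f"Approximate number needed to treat: {number_needed_to_treat:.0f}")  # about 6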
The guideline bases its strong recommendation for naloxegol on moderate-quality data from three studies, including two phase 3, double-blind, randomized, placebo-controlled trials. Although at least five randomized, controlled trials have evaluated methylnaltrexone, the evidence was low quality and therefore the guideline only conditionally recommends prescribing this PAMORA over no treatment.
The guideline also makes no recommendation on the use of the intestinal secretagogue lubiprostone or the selective 5-HT4 agonist prucalopride. Studies of lubiprostone were limited by possible reporting bias and showed no clear treatment benefit, the authors state. They describe a similar evidence gap for prucalopride, noting that at least one trial ended early without publication of the findings. They recommend further study of lubiprostone, prucalopride, and other highly selective 5-HT4 agonists for treating OIC. Head-to-head trials would help guide treatment choice for patients with laxative-refractory OIC, they add. “Cost-effectiveness studies are also lacking in this field, which could inform prescribing strategy, particularly for newer, more expensive agents.”
FROM GASTROENTEROLOGY