Mediterranean diet cut fatty liver risk
Middle-aged and older adults who closely followed a Mediterranean-style diet for 6 years were at significantly lower risk of developing fatty liver disease than others in a large prospective study.
Each 1-standard-deviation rise in Mediterranean-style Diet Score (MDS) correlated with significantly decreased hepatic fat accumulation and a 26% lower odds of new-onset fatty liver disease (P = .002). “To our knowledge, ours is the first prospective study to examine the relations of long-term habitual diet to fatty liver,” Jiantao Ma, MBBS, PhD, and his associates wrote in Gastroenterology. “Our findings indicate that improved diet quality may be particularly important for those with high genetic risk for NAFLD.”
Over a median 6 years of follow-up, each 1-standard-deviation rise in MDS correlated with a 26% decrease in odds of new-onset fatty liver (95% CI, 10% to 39%; P = .002) and with a significant increase in liver phantom ratio (0.57; 95% CI, 0.27 to 0.86; P less than .001), which signifies lower accumulation of liver fat. Similarly, every 1-standard-deviation rise in the Alternative Healthy Eating Index (AHEI) dietary score correlated with a 0.56 rise in liver phantom ratio (95% CI, 0.29 to 0.84; P less than .001) and with a 21% lower odds of incident fatty liver disease (95% CI, 5% to 35%; P = .02).
Individuals whose diets improved the most over time (those in the highest quartile of dietary score change) accumulated about 80% less liver fat between baseline and follow-up than those whose diets worsened the most (those in the lowest quartile of dietary score change). Furthermore, the relationship between diet and liver fat remained statistically significant (P = .02) even after accounting for changes in body mass index.
The investigators also studied whether the presence of single nucleotide polymorphisms (SNPs) linked with NAFLD modified dietary effects. High genetic risk for NAFLD did not appear to lead to increased liver fat as long as diet improved or remained stable over time, they found. But when diet worsened over time, high genetic NAFLD risk did correlate with significantly greater accumulation of liver fat (P less than .001).
“Future intervention studies are needed to test the efficacy and efficiency of diet-based approaches for NAFLD prevention as well as to examine mechanisms underlying the association between diet and NAFLD,” the researchers wrote.
The National Heart, Lung, and Blood Institute’s Framingham Heart Study provided funding. Affymetrix provided genotyping. The researchers reported having no financial conflicts of interest.
SOURCE: Ma J, et al. Gastroenterology. 2018 Mar 28. doi: 10.1053/j.gastro.2018.03.038
FROM GASTROENTEROLOGY
Key clinical point: A Mediterranean-style diet was associated with significantly less liver fat accumulation and significantly lower risk of fatty liver disease.
Major finding: Each 1-standard-deviation increase in the Mediterranean Diet Score (MDS) correlated with a 26% lower odds of de novo NAFLD (P = .002).
Study details: Prospective study of 1,521 adults from the Framingham Heart Study.
Disclosures: The National Heart, Lung, and Blood Institute’s Framingham Heart Study provided funding. Affymetrix provided genotyping. The researchers reported having no conflicts of interest.
Source: Ma J, et al. Gastroenterology. 2018 Mar 28.
Mesenteric adipose–derived stromal cell lactoferrin may mediate protective effects in Crohn’s disease
Inflammatory bowel disease (IBD) and Crohn’s disease (CD), in particular, are characterized by an unusual ectopic extension of mesenteric adipose tissue. This intra-abdominal fat, also known as “creeping fat,” which wraps around the intestine during the onset of CD, is associated with inflammation and ulceration of the small or large intestine. The role of this fat in the development of CD, and whether it is protective or harmful, however, is not clear.
The current study demonstrates that adipose-derived stromal cells (ADSCs), the precursor cell population of adipose tissue, promote colonocyte proliferation and exhibit a differential gene expression profile in a disease-dependent manner, according to Jill M. Hoffman, MD, and her colleagues at the University of California, Los Angeles. Increased expression and release of lactoferrin by ADSCs – an iron-binding glycoprotein and antimicrobial peptide usually found in large quantities in breast milk – was shown to be a likely mediator that could regulate inflammatory responses during CD. These results were published in Cellular and Molecular Gastroenterology and Hepatology (doi: 10.1016/j.jcmgh.2018.02.001).
Intestinal inflammation is primarily mediated by cytokine production, and targeted anticytokine therapy is the current standard for IBD treatment. The cytokine profile from CD patient–derived mesenteric ADSCs and fat tissue was significantly different from that of these patients’ disease-free counterparts. The authors hypothesized that mesenteric ADSCs release adipokines in response to disease-associated signals; this release of adipokines results from differential gene expression of mesenteric ADSCs in CD versus control patients. To test this hypothesis, conditioned media from CD patient–derived ADSCs was used to study gene expression in colonic intestinal epithelial cells in vitro and in mice with experimental colitis in vivo.
Using the Human LncRNA Expression Microarray V4.0, the investigators analyzed expression of 20,730 protein-coding mRNA targets and found 992 mRNA transcripts to be differentially expressed (at least twofold change) in CD patient–derived ADSCs, compared with control patient–derived ADSCs. Subsequent pathway analysis suggested activation of cellular growth and proliferation pathways, with caspase 8 and p42/44 as the top predicted networks differentially regulated in CD patient–derived ADSCs with respect to those of control patients.
The investigators treated intestinal epithelial cells – specifically, NCM460 – with conditioned media from the same CD or control patient–derived ADSCs; subsequent microarray profiling using the GeneChip Human Gene ST Array showed increased expression of interleukin-17A, CCL23, and VEGFA. Ingenuity Pathway Analysis of mRNA expression indicated convergence in injury and inflammation pathways on the SERPINE1 gene, suggesting it is the central regulator of the differential gene expression network.
In vivo, mice with active dextran sulfate sodium (DSS) colitis that were treated with daily injections of conditioned media from CD patients showed attenuation of colitis as compared with mice treated with vehicle or conditioned media from control patients. Furthermore, the mRNA expression of proinflammatory cytokines was reduced with increased proliferative response (as measured by Ki67 expression) in intestinal epithelial cells in the dextran sulfate sodium–treated mice receiving media from CD patients, compared with that in mice receiving media from control patients or vehicle-treated mice.
Cell proliferation was studied in real time (during a period of 120 hours) using the xCELLigence platform. The authors suggested that mesenteric adipose tissue–derived mediators may regulate proliferative responses in intestinal epithelial cells during intestinal inflammation, as observed by enhanced cell-doubling time in conditioned media from CD patient–derived ADSCs.
Levels of lactoferrin mRNA (validated by real-time polymerase chain reaction; 92.70 ± 18.41 versus 28.98 ± 5.681; P less than .05) and protein (validated by ELISA; 142.2 ± 5.653 versus 120.1 ± 3.664; P less than .01) were increased in human mesenteric ADSCs and conditioned media from CD patients, respectively, compared with those from controls.
“Compared with mice receiving vehicle injections, mice receiving daily injections of lactoferrin had improved clinical scores (5.625 ± 0.565 versus 11.125 ± 0.743; n = 8) and colon length at day 7 (6.575 ± 0.1688 versus 5.613 ± 0.1445; n = 8). In addition, we found epithelial cell proliferation was increased in the colons of lactoferrin-treated mice with colitis, compared with vehicle-treated controls (3.548e7 ± 1.547e6 versus 1.184e7 ± 2.915e6; P less than .01),” said the authors.
Collectively, the presented data suggest a protective role of mesenteric adipose tissue–derived mediators, such as lactoferrin, in the pathophysiology of CD.
The study was supported by the Broad Medical Research Program (IBD-0390), an NIDDK Ruth L. Kirschstein National Research Service Award Postdoctoral Fellowship (F32 DK102322), the Neuroendocrine Assay and Models of Gastrointestinal Function and Disease Cores (P50 DK 64539), an AGA-Broad Student Research Fellowship, the Blinder Center for Crohn’s Disease Research, the Eli and Edythe Broad Chair, and NIH/NIDDK grant DK047343.
The authors disclosed no conflicts of interest.
SOURCE: Hoffman J et al. Cell Mol Gastroenterol Hepatol. doi: 10.1016/j.jcmgh.2018.02.001.
Inflammatory bowel disease (IBD), including Crohn’s disease, is a chronic inflammatory condition of the gastrointestinal tract that is often associated with changes in adipose tissue. However, the pathophysiological significance of fat wrapping in Crohn’s disease remains largely elusive. A correlation of IBD with obesity has been established by a number of studies, which report 15%-40% of adults with IBD are obese. Obesity is found to have a negative effect on disease activity and progression to surgery in patients with Crohn’s disease. In contrast, adipose-derived stromal or stem cells exhibit regenerative and anti-inflammatory function.
A recent study published in Cellular and Molecular Gastroenterology and Hepatology by Jill M. Hoffman and her colleagues highlighted the immune-modulatory function of adipose-derived stromal cells (ADSCs) in Crohn’s disease patients. They observed that patient-derived ADSCs promote colonocyte proliferation and exhibit distinct gene expression patterns, compared with healthy controls. The authors successfully identified ADSC-derived lactoferrin, an iron binding glycoprotein and an antimicrobial peptide, as a potential immunoregulatory molecule.
Amlan Biswas, PhD, is an instructor in pediatrics at Harvard Medical School, Boston, and is affiliated with Boston Children’s Hospital in the division of gastroenterology and nutrition. He has no conflicts of interest.
FROM CMGH
Meta-analysis supports endoscopic surveillance of Barrett’s esophagus
Endoscopic surveillance also was associated with modest improvements in all-cause and cancer-specific mortality, said Don C. Codipilly, MD, of the Mayo Clinic in Rochester, Minn., with his associates. The Barrett’s esophagus community “eagerly” awaits results from the multicenter, randomized BOSS trial (Barrett’s Oesophagus Surveillance versus endoscopy at need Study), the reviewers wrote in the June issue of Gastroenterology.
Guidelines recommend endoscopic surveillance of patients with Barrett’s esophagus, but it is unclear whether this practice improves survival. Hence, the reviewers searched Ovid MEDLINE, Embase, PubMed, and Scopus for studies published since 1996 that evaluated outcomes of endoscopic surveillance in Barrett’s esophagus. Eligible studies included a case-control study of a large hospital database in northern California, 6 retrospective cohort studies of endoscopic Barrett’s esophagus surveillance, and 11 prospective and retrospective studies comparing patients with esophageal adenocarcinoma (EAC) with and without a history of Barrett’s esophagus.
The case-control study found no link between endoscopic surveillance and improved survival in Barrett’s esophagus, said the reviewers. However, the retrospective cohort studies linked regular Barrett’s esophagus endoscopic surveillance with a 40% lower risk of death from EAC, compared with incomplete or no endoscopic surveillance, which was statistically significant (risk ratio, 0.60; 95% confidence interval, 0.50-0.71). Individual results of these studies also were consistent with the results of their meta-analysis. A separate meta-analysis of the remaining studies also linked endoscopic surveillance with a significantly lower risk of EAC-related mortality (RR, 0.73; 95% CI, 0.57-0.94), but these studies had substantial heterogeneity, the investigators said.
Meta-analyses also supported endoscopic surveillance for earlier detection of EAC. Unadjusted data from four studies indicated that Barrett’s esophagus surveillance helped detect EAC while it was still early stage (stage 0 or 1) rather than later stage (RR, 2.1; 95% CI, 1.1-4.1). Similarly, patients who had already been diagnosed with Barrett’s esophagus were significantly more likely to present with early-stage EAC (RR, 5.5; 95% CI, 3.7-8.2). In contrast, enrollment in a Barrett’s esophagus surveillance program did not appear to affect the likelihood of esophagectomy.
Additional meta-analyses suggested that endoscopic surveillance of Barrett’s esophagus might confer a “potentially small” overall survival benefit, the reviewers said. A meta-analysis of adjusted data from three studies linked surveillance with a 25% reduction in risk of all-cause mortality, compared with no surveillance (hazard ratio, 0.75; 95% CI, 0.58-0.94). Having a prior Barrett’s esophagus diagnosis also was associated with a 52% decrease in all-cause mortality, compared with having symptomatic cancer (RR, 0.48; 95% CI, 0.37-0.63) in a meta-analysis of unadjusted data from 12 studies. Five studies with adjusted data linked a prior Barrett’s esophagus diagnosis with a 41% lower risk of all-cause mortality (HR, 0.59; 95% CI, 0.45-0.76). However, individual findings varied substantially, and adjusting for lead time “almost eliminated this overall survival benefit,” the reviewers said.
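The percentage reductions quoted alongside these estimates follow directly from the ratios: a risk or hazard ratio r corresponds to a (1 − r) × 100% lower risk. A minimal sketch of that arithmetic (the function name is illustrative, not from the study):

```python
def percent_reduction(ratio: float) -> int:
    """Convert a risk or hazard ratio into the percent risk
    reduction quoted in the text, e.g. RR 0.60 -> 40% lower risk."""
    return round((1.0 - ratio) * 100)

# Ratios reported in the meta-analyses above
print(percent_reduction(0.60))  # EAC-specific mortality, cohort studies -> 40
print(percent_reduction(0.75))  # all-cause mortality, adjusted data -> 25
print(percent_reduction(0.48))  # prior diagnosis vs. symptomatic cancer -> 52
```

The same conversion underlies every "X% lower risk" figure in this report.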
The National Institutes of Health and a Public Health Service Award supported the study. Dr. Codipilly reported having no conflicts of interest. Five coinvestigators disclosed ties to Exact Sciences, C2 Therapeutics, and other companies.
SOURCE: Codipilly DC et al. Gastroenterology. 2018 Feb 17. doi: 10.1053/j.gastro.2018.02.022.
Endoscopic surveillance also was associated with modest improvements in all-cause and cancer-specific mortality, said Don C. Codipilly, MD, of the Mayo Clinic in Rochester, Minn., and his associates. The Barrett’s esophagus community “eagerly” awaits results from the multicenter, randomized BOSS trial (Barrett’s Oesophagus Surveillance versus endoscopy at need Study), the reviewers wrote in the June issue of Gastroenterology.
FROM GASTROENTEROLOGY
Key clinical point: Endoscopic surveillance of Barrett’s esophagus was associated with significantly earlier cancer detection and conferred a small survival benefit.
Major finding: Risk ratios for esophageal adenocarcinoma–specific mortality ranged from 0.60 to 0.73 and reached statistical significance.
Study details: A systematic review and meta-analysis of 18 studies.
Disclosures: The National Institutes of Health and a Public Health Service Award supported the study. Dr. Codipilly reported having no conflicts of interest. Five coinvestigators disclosed ties to Exact Sciences, C2 Therapeutics, and other companies.
Source: Codipilly DC et al. Gastroenterology. 2018 Feb 17. doi: 10.1053/j.gastro.2018.02.022.
Study eyes liver transplantation after Region 5 UNOS downstaging
Liver transplantation led to “excellent outcomes” when performed after downstaging hepatocellular carcinoma using the UNOS (United Network for Organ Sharing) Region 5 protocol, investigators reported.
Downstaging succeeded for 58% of patients, and an estimated 87% of transplantation recipients were alive and recurrence free at 5 years, said Neil Mehta, MD, of the University of California, San Francisco, and his associates. The findings support expanding priority access to liver transplantation to include patients whose hepatocellular carcinoma (HCC) has been successfully downstaged, they said. “In the meantime, UNOS has recently approved the Region 5 downstaging protocol for receiving automatic HCC-MELD exception listing,” they wrote. The report was published in the June issue of Clinical Gastroenterology and Hepatology (doi: 10.1016/j.cgh.2017.11.037).
This is the first multicenter study of HCC downstaging according to a uniform protocol, the researchers noted. In multivariable analyses, downstaging was significantly more likely to fail in the setting of moderate to severe (Child-Pugh B or C) hepatic impairment (hazard ratio, 3.3; 95% confidence interval, 3.0 to 3.6; P less than .001) or a baseline alpha-fetoprotein level above 1,000 ng/mL (HR, 1.6; 95% CI, 1.4 to 1.9; P less than .001).
The incidence of HCC in the United States is expected to keep rising for at least another decade because of epidemic levels of fatty liver disease and chronic hepatitis C, the investigators noted. Downstaging HCC with local-regional therapy is a common bridge to transplantation, and successful treatment tends to reflect favorable tumor biology, which bodes well for transplantation. However, no multicenter study had evaluated these associations. Therefore, the investigators retrospectively studied 187 patients with HCC from three centers in California who underwent downstaging according to the UNOS Region 5 protocol between 2002 and 2012.
A total of 156 patients (83%) were successfully downstaged to within Milan criteria after a median of 2.7 months (interquartile range, 1.4 to 4.9 months), said the researchers. Among patients who were successfully downstaged but did not undergo transplantation, 37 had tumor progression or died from liver-related causes after a median of 6 months, while 10 remained on the transplant list. Among the 109 patients who underwent transplantation after a median of 13 months (interquartile range, 6 to 19 months), median follow-up was 4.3 years, estimated 5-year survival was 80%, and estimated recurrence-free survival was 87%.
Fully 68% of successfully downstaged patients required only one local-regional treatment, the researchers said. The Region 5 protocol considers patients eligible for downstaging if they have a single HCC lesion measuring up to 8 cm or multiple lesions whose combined diameters do not exceed 8 cm, and no evidence of extrahepatic disease or vascular invasion on multiphase computed tomography or magnetic resonance imaging.
The protocol considers downstaging successful if it results in one lesion measuring up to 5 cm or no more than three lesions of up to 3 cm each. Thus, patients who start out with four or five lesions must have complete necrosis of at least one to two tumors. Successfully downstaged patients must remain free of acute hepatic decompensation for at least 3 consecutive months before undergoing transplantation, according to the protocol.
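The eligibility and success rules described above amount to a simple checklist. A minimal sketch, encoding only what the article states (function and parameter names are illustrative, not from the protocol itself):

```python
def eligible_for_downstaging(lesions_cm, extrahepatic=False, vascular_invasion=False):
    """UNOS Region 5 downstaging eligibility as summarized above:
    a single HCC lesion up to 8 cm, or multiple lesions whose combined
    diameters do not exceed 8 cm, with no extrahepatic disease or
    vascular invasion on multiphase CT or MRI."""
    if extrahepatic or vascular_invasion:
        return False
    return sum(lesions_cm) <= 8.0

def downstaging_successful(lesions_cm):
    """Success criterion as summarized above: one lesion up to 5 cm,
    or no more than three lesions measuring up to 3 cm each."""
    if len(lesions_cm) == 1:
        return lesions_cm[0] <= 5.0
    return len(lesions_cm) <= 3 and all(d <= 3.0 for d in lesions_cm)
```

For example, a solitary 7-cm tumor without vascular invasion would be eligible for downstaging, and a single 4-cm residual lesion after local-regional therapy would meet the success criterion.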
“Slight refinements in the inclusion criteria for downstaging seem warranted [given] that all Child’s B/C patients with pretreatment alpha-fetoprotein greater than 1000 ng/mL suffered poor outcomes when downstaging was attempted,” the investigators noted. They reported that the 1-year risk of failed downstaging was 70% among patients with both Child’s B/C cirrhosis and alpha-fetoprotein level at or above 1000 ng/mL, 32% among patients with one risk factor, and 14% among patients with no risk factors (P less than .001).
The National Institutes of Health provided partial funding. The investigators reported having no conflicts of interest.
SOURCE: Mehta N, et al. Clin Gastroenterol Hepatol. 2017 Nov 23. doi: 10.1016/j.cgh.2017.11.037.
Liver transplantation of selected patients with hepatocellular carcinoma (HCC) is an accepted indication and is associated with excellent outcomes. Until recently, criteria for liver transplantation were based on the Milan criteria, which took only the size and number of tumors into consideration. In this multicenter study, patients who were outside of Milan criteria were successfully downstaged to within Milan criteria with locoregional therapy and subsequently transplanted with excellent outcomes. Salient features included the following. 1) A waiting period of 6 months after the first treatment and 3 months after downstaging was required to ensure that the tumor stage remained within Milan criteria. 2) Any specific type of locoregional therapy was allowed. 3) Downstaging was possible in a majority of patients after a single treatment. 4) Patients with alpha-fetoprotein greater than 1000 ng/mL (approximately 10%) as well as those with substantial decompensated liver disease (approximately 40%) did not have favorable outcomes. 5) On multivariable analysis, tumor biology was a stronger predictor of poor outcomes than was stage of liver disease.
Sumeet K. Asrani, MD, MSc, is associate professor in medicine and hepatologist at Baylor University Medical Center, and medical director of the Center for Advanced Liver Disease, Dallas. He has no conflicts of interest.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: Liver transplantation led to excellent outcomes when performed after downstaging hepatocellular carcinoma according to the UNOS (United Network for Organ Sharing) Region 5 protocol.
Major finding: Downstaging succeeded in 58% of patients. Estimated 5-year posttransplantation recurrence-free survival was 87%.
Study details: Retrospective multicenter study of 187 patients with hepatocellular carcinoma.
Disclosures: The National Institutes of Health provided partial funding. The investigators reported having no conflicts of interest.
Source: Mehta N et al. Clin Gastroenterol Hepatol. 2017 Nov 23. doi: 10.1016/j.cgh.2017.11.037.
Colonic diverticulosis not linked to mucosal inflammation, GI symptoms
Colonic diverticulosis was not associated with mucosal inflammation or gastrointestinal symptoms in a single-center, prospective study of adults undergoing their first screening colonoscopy.
After adjustment for age, sex, and body mass index, there were no significant links between diverticulosis and tumor necrosis factor, CD4+ cells, CD8+ cells, CD57+ cells, irritable bowel syndrome, or chronic abdominal pain, reported Anne F. Peery, MD, with her associates at the University of North Carolina at Chapel Hill. “Our findings strongly question the rationale for treating symptomatic uncomplicated diverticular disease with mesalamine,” they wrote in the June issue of Clinical Gastroenterology and Hepatology (doi: 10.1016/j.cgh.2017.05.051).
Colonic diverticula affect more than half of individuals in the United States over the age of 60 years, according to the results of past studies. “Although colonic diverticulosis can be complicated by the overt inflammation of acute diverticulitis, there is some thought that colonic diverticulosis is associated with low-grade mucosal inflammation,” the researchers said. “Moreover, this low-grade diverticular inflammation is believed to contribute to chronic gastrointestinal symptoms.” However, no rigorous prospective study had tested these assertions.
Accordingly, the researchers evaluated prospective data from 619 outpatients aged 30 years and older who underwent screening colonoscopies for the first time during 2013-2015. These patients had consented to participate in a study of risk factors for colonic diverticulosis. Most were white (76%) or black (21%), and most were aged 50-59 years.
A total of 255 individuals had diverticula, while 364 controls did not. Patients with diverticula tended to be older and were more often male (47% vs. 41% of controls) and overweight or obese (72% vs. 62%). After adjustment for age, sex, and body mass index, there was no evidence linking diverticulosis with tumor necrosis factor alpha expression (odds ratio, 0.9; 95% confidence interval, 0.6-1.2), CD4+ cells (OR, 1.2; 95% CI, 0.9-1.6), CD8+ cells (OR, 1.0; 95% CI, 0.7-1.3), or CD57+ cells (OR, 0.8; 95% CI, 0.6-1.1).
Among 42 patients who met Rome III criteria for irritable bowel syndrome, 11 had diverticulosis. Diverticulosis in IBS was not associated with changes in expression of the mucosal inflammatory markers interleukin-6, interleukin-10, tumor necrosis factor, CD4, CD8, or mast cell tryptase, said the researchers. A total of 63 patients had chronic abdominal pain, of whom 22 also had diverticulosis. There were no significant differences in mucosal inflammatory markers between symptomatic patients with diverticula and those without. Adjusted analysis found no association between number of diverticula and chronic abdominal pain (OR, 0.7; 95% CI, 0.4-1.2) or IBS (OR, 0.5; 95% CI, 0.3-1.1).
The number of patients with IBS in this study was small, and the researchers did not ascertain IBS symptom severity, they noted. “Although we studied several immune markers and cytokines, there are other potential markers that may be associated with chronic inflammation,” they added. “Multianalyte profiling could be used to assess an array of cytokines, and markers for macrophages (CD68), global T cells (CD3), and B cells (CD19). Whether there is utility in further studies given our negative results is debatable.”
The National Institutes of Health provided funding. The investigators reported having no conflicts of interest.
SOURCE: Peery AF et al. Clin Gastroenterol Hepatol. 2017 Jun 8. doi: 10.1016/j.cgh.2017.05.051.
Colonic diverticulosis was not associated with mucosal inflammation or gastrointestinal symptoms in a single-center, prospective study of adults undergoing their first screening colonoscopy.
After adjustment for age, sex, and body mass index, there were no significant links between diverticulosis and tumor necrosis factor, CD4+ cells, CD8+ cells, CD57+ cells, irritable bowel syndrome, or chronic abdominal pain, reported Anne F. Peery, MD, with her associates at the University of North Carolina at Chapel Hill. “Our findings strongly question the rationale for treating symptomatic uncomplicated diverticular disease with mesalamine,” they wrote in the June issue of Clinical Gastroenterology and Hepatology (doi: 10.1016/j.cgh.2017.05.051).
Colonic diverticula affect more than half of individuals in the United States over the age of 60 years, according to the results of past studies. “Although colonic diverticulosis can be complicated by the overt inflammation of acute diverticulitis, there is some thought that colonic diverticulosis is associated with low-grade mucosal inflammation,” the researchers said. “Moreover, this low-grade diverticular inflammation is believed to contribute to chronic gastrointestinal symptoms.” However, no rigorous prospective study had tested these assertions.
Accordingly, the researchers evaluated prospective data from 619 outpatients aged 30 years and older who underwent screening colonoscopies for the first time during 2013-2015. These patients had consented to participate in a study of risk factors for colonic diverticulosis. Most were white (76%) or black (21%), and most were aged 50-59 years.
A total of 255 individuals had diverticula while 364 controls did not. Patients with diverticula tended to be older and were more often male (47% vs. 41% of controls) and overweight or obese (72% vs. 62%). After adjustment for age, sex, and body mass index, there was no evidence linking diverticulosis with tumor necrosis factor alpha expression (odds ratio, 0.9; 95% confidence interval, 0.6-1.2), CD4+ cells (OR, 1.2; 95% CI, 0.9-1.6), CD8+ cells (OR, 1.0; 95% CI, 0.7-1.3), or CD57+ cells (OR, 0.8; 95% CI, 0.6-1.1).
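The adjusted odds ratios above come from regression models, but the unadjusted version of the statistic is simple arithmetic on a 2x2 table. A minimal sketch, using hypothetical counts chosen only to illustrate the calculation (not the study's raw data):

```python
# Crude (unadjusted) odds ratio from a 2x2 exposure/outcome table.
# All counts below are hypothetical, for illustration only.
def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Odds of the outcome in the exposed group divided by odds in the unexposed."""
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

# Hypothetical: 40 of 255 patients with diverticula and 60 of 364 controls
# show elevated expression of some inflammatory marker.
print(round(odds_ratio(40, 215, 60, 304), 2))  # 0.94
```

An odds ratio near 1.0, as in the study's adjusted estimates, indicates no meaningful association; the published values additionally adjust for age, sex, and body mass index.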
Among 42 patients who met Rome III criteria for irritable bowel syndrome, 11 had diverticulosis. Diverticulosis in IBS was not associated with changes in expression of the mucosal inflammatory markers interleukin-6, interleukin-10, tumor necrosis factor, CD4, CD8, or mast cell tryptase, said the researchers. A total of 63 patients had chronic abdominal pain, of whom 22 also had diverticulosis. There were no significant differences in mucosal inflammatory markers between symptomatic patients with diverticula and those without. Adjusted analysis found no association between number of diverticula and chronic abdominal pain (OR, 0.7; 95% CI, 0.4-1.2) or IBS (OR, 0.5; 95% CI, 0.3-1.1).
The number of patients with IBS in this study was small, and the researchers did not ascertain IBS symptom severity, they noted. “Although we studied several immune markers and cytokines, there are other potential markers that may be associated with chronic inflammation,” they added. “Multianalyte profiling could be used to assess an array of cytokines, and markers for macrophages (CD68), global T cells (CD3), and B cells (CD19). Whether there is utility in further studies given our negative results is debatable.”
The National Institutes of Health provided funding. The investigators reported having no conflicts of interest.
SOURCE: Peery AF et al. Clin Gastroenterol Hepatol. 2017 Jun 8. doi: 10.1016/j.cgh.2017.05.051.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: Colonic diverticula were not associated with mucosal inflammation or chronic gastrointestinal symptoms.
Major finding: After adjustment for possible confounders, there were no significant associations between diverticulosis and tumor necrosis factor, CD4+ cells, CD8+ cells, CD57+ cells, irritable bowel syndrome, or chronic abdominal pain.
Study details: Single-center prospective study of 619 patients undergoing screening colonoscopies.
Disclosures: The National Institutes of Health provided funding. The investigators reported having no conflicts of interest.
Source: Peery AF et al. Clin Gastroenterol Hepatol. 2017 Jun 8. doi: 10.1016/j.cgh.2017.05.051
AGA Clinical Practice Update: Screening for Barrett’s esophagus requires consideration for those most at risk
The evidence discussed in this article supports the current recommendation of GI societies that screening endoscopy for Barrett’s esophagus be performed only in well-defined, high-risk populations. Alternative tests for screening are not now recommended; however, some of the alternative tests show great promise, and it is expected that they will soon find a useful place in clinical practice. At the same time, there should be a complementary focus on using demographic and clinical factors as well as noninvasive tools to further define populations for screening. All tests and tools should be balanced with the cost and potential risks of the screening proposed.
Stuart Spechler, MD, of the University of Texas and his colleagues looked at a variety of techniques, both conventional and novel, as well as the cost effectiveness of these strategies in a commentary published in the May issue of Gastroenterology.
Some studies have shown that endoscopic surveillance programs identify early-stage cancers and yield better outcomes than those seen in patients who present only after developing cancer symptoms. One meta-analysis, which included 51 studies with 11,028 subjects, found that patients with surveillance-detected esophageal adenocarcinoma (EAC) had a 61% reduction in mortality risk. Other studies have shown similar results but are susceptible to certain biases. Still other studies have found no benefit from surveillance at all: in those studies, patients with Barrett’s esophagus who died of EAC had undergone surveillance at rates similar to controls, suggesting that surveillance did little to improve their outcomes.
Perhaps one of the most intriguing and cost-effective strategies is to identify patients likely to have Barrett’s esophagus with a prediction tool built from demographic and clinical history. Such tools have been developed but have shown only modest discrimination, with areas under the receiver operating characteristic curve (AUROC) ranging from 0.61 to 0.75. One study combined obesity, smoking history, and increasing age with at-least-weekly symptoms of gastroesophageal reflux and improved results by nearly 25%. Modified versions of this model have also shown improved detection: when Thrift et al. added factors such as education level, body mass index, smoking status, and serious alarm symptoms such as unexplained weight loss, the model improved the AUROC to 0.85 (95% confidence interval, 0.78-0.91). The clinical utility of these models is still unclear. Nonetheless, they have led certain GI societies to recommend endoscopic screening only for patients with additional risk factors.
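For readers unfamiliar with the AUROC values quoted above: the statistic equals the probability that a randomly chosen case receives a higher model score than a randomly chosen control (0.5 is chance; 1.0 is perfect discrimination). A minimal sketch of the computation, with made-up scores and labels purely for illustration:

```python
# AUROC as the pairwise rank statistic: the probability that a randomly
# chosen case scores higher than a randomly chosen control (ties count half).
# Scores and labels below are hypothetical, for illustration only.
def auroc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]  # case scores
    neg = [s for s, y in zip(scores, labels) if y == 0]  # control scores
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1, 1, 0, 1, 0, 0]
print(auroc(scores, labels))  # 8 of 9 case-control pairs ranked correctly, ~0.889
```

On this scale, moving a model from 0.75 to 0.85, as in the Thrift et al. example, is a substantial gain in how reliably cases outrank controls.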
Although predictive models may assist in identifying at-risk patients, endoscopy is still needed to establish the diagnosis. Transnasal endoscopes (TNEs), the thinner cousins of the standard endoscope, tend to be better tolerated by patients and cause less gagging. One study showed that TNE improved participation compared with standard endoscopy (45.7% vs. 40.7%), and almost 80% of TNE patients were willing to undergo the procedure again. Despite these advantages, TNEs yielded significantly lower biopsy acquisition rates than standard endoscopes (83% vs. 100%; P = .001) because of the sheath on the endoscope. Other studies have demonstrated the strengths of TNEs, including one in which 38% of patients had a finding that changed management of their disease. TNEs should be considered a reliable screening tool for Barrett’s esophagus.
Other imaging advances, such as the high-resolution complementary metal oxide semiconductor (CMOS) sensor, which is small enough to fit into a pill capsule, have led researchers to examine capsule endoscopy as a screening tool for Barrett’s esophagus. One meta-analysis of 618 patients found pooled sensitivity and specificity for diagnosis of 77% and 86%, respectively. Despite producing high-quality images, the device remains difficult to control and cannot obtain biopsy samples.
Another swallowed device, the Cytosponge-TFF3, is an ingestible capsule that dissolves in stomach acid. After about 5 minutes, the capsule releases a mesh sponge that is withdrawn through the mouth, scraping the esophagus and gathering a cytology sample. The Cytosponge proved effective in the first Barrett’s Esophagus Screening Trial (BEST 1). BEST 2 enrolled 463 controls and 647 patients with Barrett’s esophagus across 11 United Kingdom hospitals; in that trial, the Cytosponge had a sensitivity of 79.9%, which increased to 87.2% in patients with more than 3 cm of circumferential Barrett’s metaplasia.
Breaking from the invasive nature of imaging scopes and the Cytosponge, some researchers are pursuing “liquid biopsy” blood tests that detect circulating abnormalities, such as DNA or microRNA (miRNA), to identify precursors or the presence of disease. Much remains to be done to develop a clinically meaningful test, but the use of miRNAs to detect disease is an intriguing option. miRNAs regulate gene expression, and their dysregulation has been associated with the development of many diseases. One study found that patients with Barrett’s esophagus had increased levels of miRNA-194, -215, and -143, but these findings were not confirmed in a larger study. Other studies have reported similar signals, but more research is needed to validate them in larger cohorts.
Other novel detection strategies have been investigated, including serum adipokine assays and electronic-nose breath tests. The serum adipokine test measures the metabolically active adipokines secreted in obese patients and those with metabolic syndrome to see whether they predict the presence of Barrett’s esophagus. Unfortunately, the data appear to be conflicting, although these tests might be used alongside other tools to detect Barrett’s esophagus. Electronic-nose breath tests work by detecting volatile compounds produced by human and gut bacterial metabolism. One study found that analyzing these compounds could distinguish Barrett’s from non-Barrett’s patients with 82% sensitivity, 80% specificity, and 81% accuracy. Both of these technologies need large prospective studies in primary care to validate their clinical utility.
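The sensitivity, specificity, and accuracy figures quoted for these tests all derive from the same 2x2 confusion table. A minimal sketch of that arithmetic, using hypothetical counts chosen only so the outputs resemble the breath-test figures above (they are not the study's actual data):

```python
# Diagnostic test metrics from a 2x2 confusion table.
# tp/fp/fn/tn counts below are hypothetical, for illustration only.
def diagnostic_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)              # diseased patients correctly flagged
    specificity = tn / (tn + fp)              # disease-free patients correctly cleared
    accuracy = (tp + tn) / (tp + fp + fn + tn)  # all correct calls over all patients
    return sensitivity, specificity, accuracy

sens, spec, acc = diagnostic_metrics(tp=82, fp=20, fn=18, tn=80)
print(sens, spec, acc)  # 0.82 0.8 0.81
```

Note that accuracy depends on how many diseased and disease-free patients are tested, so for a screening test applied to a mostly healthy population, sensitivity and specificity are the more informative pair.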
A discussion of the effectiveness of these screening tools would be incomplete without a discussion of their costs. Currently, endoscopic screening costs are high. Therefore, it is important to reserve these tools for the patients who will benefit the most – in other words, patients with clear risk factors for Barrett’s esophagus. Even the capsule endoscope is quite expensive because of the cost of materials associated with the tool.
Cost-effectiveness calculations surrounding the Cytosponge are particularly complicated. One analysis computed an incremental cost-effectiveness ratio (ICER) for endoscopy, compared with Cytosponge, ranging from $107,583 to $330,361. By contrast, the ICER for Cytosponge screening, compared with no screening, ranges from $26,358 to $33,307, well within the commonly cited threshold of what society is willing to pay (up to $50,000 per quality-adjusted life-year gained).
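The ICER itself is a simple ratio: the extra cost of one strategy over another, divided by the extra quality-adjusted life-years (QALYs) it delivers. A minimal sketch with hypothetical per-patient numbers (the cited analysis reports only the resulting ICER ranges, not these inputs):

```python
# Incremental cost-effectiveness ratio (ICER): extra dollars per extra QALY
# when choosing a new strategy over a comparator.
# The cost and QALY figures below are hypothetical, for illustration only.
def icer(cost_new, cost_old, qaly_new, qaly_old):
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical: screening costs $800 more per person and adds 0.03 QALYs.
print(icer(cost_new=1000, cost_old=200, qaly_new=15.03, qaly_old=15.00))  # roughly $26,700 per QALY
```

A strategy is conventionally deemed cost effective when its ICER falls below the willingness-to-pay threshold, here $50,000 per QALY gained.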
With all of this information in mind, it would be useful to look at Barrett’s esophagus and the tools used to diagnose it from a broader perspective.
While the adoption of a new screening strategy could succeed where others have failed, Dr. Spechler points out the potential harm.
“There also is potential for harm in identifying asymptomatic patients with Barrett’s esophagus. In addition to the high costs and small risks of standard endoscopy, the diagnosis of Barrett’s esophagus can cause psychological stress, have a negative impact on quality of life, result in higher premiums for health and life insurance, and might identify innocuous lesions that lead to potentially hazardous invasive treatments. Efforts should therefore be continued to combine biomarkers for Barrett’s with risk stratification. Overall, while these vexing uncertainties must temper enthusiasm for the unqualified endorsement of any screening test for Barrett’s esophagus, the alternative of making no attempt to stem the rapidly rising incidence of a lethal malignancy also is unpalatable.”
The development of this commentary was supported solely by the American Gastroenterological Association Institute. No conflicts of interest were disclosed for this report.
SOURCE: Spechler S et al. Gastroenterology. 2018 May. doi: 10.1053/j.gastro.2018.03.031.
AGA Resource
AGA patient education on Barrett’s esophagus will help your patients better understand the disease and how to manage it. Learn more at gastro.org/patient-care.
The evidence discussed in this article supports the current recommendation of GI societies that screening endoscopy for Barrett’s esophagus be performed only in well-defined, high-risk populations. Alternative tests for screening are not now recommended; however, some of the alternative tests show great promise, and it is expected that they will soon find a useful place in clinical practice. At the same time, there should be a complementary focus on using demographic and clinical factors as well as noninvasive tools to further define populations for screening. All tests and tools should be balanced with the cost and potential risks of the screening proposed.
Stuart Spechler, MD, of the University of Texas and his colleagues looked at a variety of techniques, both conventional and novel, as well as the cost effectiveness of these strategies in a commentary published in the May issue of Gastroenterology.
Some studies have shown that endoscopic surveillance programs have identified early-stage cancer and provided better outcomes, compared with patients presenting after they already have cancer symptoms. One meta-analysis included 51 studies with 11,028 subjects and demonstrated that patients who had surveillance-detected esophageal adenocarcinoma (EAC) had a 61% reduction in their mortality risk. Other studies have shown similar results, but are susceptible to certain biases. Still other studies have refuted that the surveillance programs help at all. In fact, those with Barrett’s esophagus who died of EAC underwent similar surveillance, compared with controls, in those studies, showing that surveillance did very little to improve their outcomes.
Perhaps one of the most intriguing and cost-effective strategies is to identify patients with Barrett’s esophagus and develop a tool based on demographic and historical information. Tools like this have been developed, but have shown lukewarm results, with areas under the receiver operating characteristic curve (AUROC) ranging from 0.61 to 0.75. One study used information concerning obesity, smoking history, and increasing age, combined with weekly symptoms of gastroesophageal reflux and found that this improved results by nearly 25%. Modified versions of this model have also shown improved detection. When Thrift et al. added additional factors like education level, body mass index, smoking status, and more serious alarm symptoms like unexplained weight loss, the model was able to improve AUROC scores to 0.85 (95% confidence interval, 0.78-0.91). Of course, the clinical utility of these models is still unclear. Nonetheless, these models have influenced certain GI societies that only believe in endoscopic screening of patients with additional risk factors.
Although predictive models may assist in identifying at-risk patients, endoscopes are still needed to diagnose. Transnasal endoscopes (TNEs), the thinner cousins of the regular endoscope, tend to be better tolerated by patients and result in less gagging. One study showed that TNEs (45.7%) improved participation, compared with standard endoscopy (40.7%), and almost 80% of TNE patients were willing to undergo the procedure again. Despite the positives, TNEs provided significantly lower biopsy acquisitions than standard endoscopes (83% vs. 100%, P = .001) because of the sheathing on the endoscope. Other studies have demonstrated the strengths of TNEs, including a study in which 38% of patients had a finding that changed management of their disease. TNEs should be considered a reliable screening tool for Barrett’s esophagus.
Other advances in imaging technology like the advent of the high-resolution complementary metal oxide semiconductor (CMOS), which is small enough to fit into a pill capsule, have led researchers to look into its effectiveness as a screening tool for Barrett’s esophagus. One meta-analysis of 618 patients found that the pooled sensitivity and specificity for diagnosis were 77% and 86%, respectively. Despite its ability to produce high-quality images, the device remains difficult to control and lacks the ability to obtain biopsy samples.
Another example of a swallowed medical device, the Cytosponge-TFF3 is an ingestible capsule that degrades in stomach acid. After 5 minutes, the capsule dissolves and releases a mesh sponge that will be withdrawn through the mouth, scraping the esophagus and gathering a sample. The Cytosponge has proven effective in the Barrett’s Esophagus Screening Trials (BEST) 1. The BEST 2 looked at 463 control and 647 patients with Barrett’s esophagus across 11 United Kingdom hospitals. The trial showed that the Cytosponge exhibited sensitivity of 79.9%, which increased to 87.2% in patients with more than 3 cm of circumferential Barrett’s metaplasia.
Breaking from the invasive nature of imaging scopes and the Cytosponge, some researchers are looking to use “liquid biopsy” or blood tests to detect abnormalities in the blood like DNA or microRNA (miRNA) to identify precursors or presence of a disease. Much remains to be done to develop a clinically meaningful test, but the use of miRNAs to detect disease is an intriguing option. miRNAs control gene expression, and their dysregulation has been associated with the development of many diseases. One study found that patients with Barrett’s esophagus had increased levels of miRNA-194, 215, and 143 but these findings were not validated in a larger study. Other studies have demonstrated similar findings, but more research must be done to validate these findings in larger cohorts.
Other novel detection therapies have been investigated, including serum adipokine and electronic nose breathing tests. The serum adipokine test looks at the metabolically active adipokines secreted in obese patients and those with metabolic syndrome to see if they could predict the presence of Barrett’s esophagus. Unfortunately, the data appear to be conflicting, but these tests can be used in conjunction with other tools to detect Barrett’s esophagus. Electronic nose breathing tests also work by detecting metabolically active compounds from human and gut bacterial metabolism. One study found that analyzing these volatile compounds could delineate between Barrett’s and non-Barrett’s patients with 82% sensitivity, 80% specificity, and 81% accuracy. Both of these technologies need large prospective studies in primary care to validate their clinical utility.
A discussion of the effectiveness of these screening tools would be incomplete without a discussion of their costs. Currently, endoscopic screening costs are high. Therefore, it is important to reserve these tools for the patients who will benefit the most – in other words, patients with clear risk factors for Barrett’s esophagus. Even the capsule endoscope is quite expensive because of the cost of materials associated with the tool.
Cost-effectivenes calculations surrounding the Cytosponge are particularly complicated. One analysis found the computed incremental cost-effectiveness ratio (ICER) of endoscopy, compared with Cytosponge, to have a range of $107,583-$330,361. The potential benefit that Cytosponge offers comes at an ICER for Cytosponge screening, compared with no screening, that ranges from $26,358 to $33,307. The numbers skyrocket when you consider what society would be willing to pay (up to $50,000 per quality-adjusted life-year gained).
With all of this information in mind, it would be useful to look at Barrett’s esophagus and the tools used to diagnose it from a broader perspective.
While the adoption of a new screening strategy could succeed where others have failed, Dr. Spechler points out the potential harm.
“There also is potential for harm in identifying asymptomatic patients with Barrett’s esophagus. In addition to the high costs and small risks of standard endoscopy, the diagnosis of Barrett’s esophagus can cause psychological stress, have a negative impact on quality of life, result in higher premiums for health and life insurance, and might identify innocuous lesions that lead to potentially hazardous invasive treatments. Efforts should therefore be continued to combine biomarkers for Barrett’s with risk stratification. Overall, while these vexing uncertainties must temper enthusiasm for the unqualified endorsement of any screening test for Barrett’s esophagus, the alternative of making no attempt to stem the rapidly rising incidence of a lethal malignancy also is unpalatable.”
The development of this commentary was supported solely by the American Gastroenterological Association Institute. No conflicts of interest were disclosed for this report.
SOURCE: Spechler S et al. Gastroenterology. 2018 May doi: 10.1053/j.gastro.2018.03.031).
AGA Resource
AGA patient education on Barrett’s esophagus will help your patients better understand the disease and how to manage it. Learn more at gastro.org/patient-care.
The evidence discussed in this article supports the current recommendation of GI societies that screening endoscopy for Barrett’s esophagus be performed only in well-defined, high-risk populations. Alternative tests for screening are not now recommended; however, some of the alternative tests show great promise, and it is expected that they will soon find a useful place in clinical practice. At the same time, there should be a complementary focus on using demographic and clinical factors as well as noninvasive tools to further define populations for screening. All tests and tools should be balanced with the cost and potential risks of the screening proposed.
Stuart Spechler, MD, of the University of Texas and his colleagues looked at a variety of techniques, both conventional and novel, as well as the cost effectiveness of these strategies in a commentary published in the May issue of Gastroenterology.
Some studies have shown that endoscopic surveillance programs have identified early-stage cancer and provided better outcomes, compared with patients presenting after they already have cancer symptoms. One meta-analysis included 51 studies with 11,028 subjects and demonstrated that patients who had surveillance-detected esophageal adenocarcinoma (EAC) had a 61% reduction in their mortality risk. Other studies have shown similar results, but are susceptible to certain biases. Still other studies have refuted that the surveillance programs help at all. In fact, those with Barrett’s esophagus who died of EAC underwent similar surveillance, compared with controls, in those studies, showing that surveillance did very little to improve their outcomes.
Perhaps one of the most intriguing and cost-effective strategies is to identify patients with Barrett’s esophagus and develop a tool based on demographic and historical information. Tools like this have been developed, but have shown lukewarm results, with areas under the receiver operating characteristic curve (AUROC) ranging from 0.61 to 0.75. One study used information concerning obesity, smoking history, and increasing age, combined with weekly symptoms of gastroesophageal reflux and found that this improved results by nearly 25%. Modified versions of this model have also shown improved detection. When Thrift et al. added additional factors like education level, body mass index, smoking status, and more serious alarm symptoms like unexplained weight loss, the model was able to improve AUROC scores to 0.85 (95% confidence interval, 0.78-0.91). Of course, the clinical utility of these models is still unclear. Nonetheless, these models have influenced certain GI societies that only believe in endoscopic screening of patients with additional risk factors.
Although predictive models may assist in identifying at-risk patients, endoscopes are still needed to diagnose. Transnasal endoscopes (TNEs), the thinner cousins of the regular endoscope, tend to be better tolerated by patients and result in less gagging. One study showed that TNEs (45.7%) improved participation, compared with standard endoscopy (40.7%), and almost 80% of TNE patients were willing to undergo the procedure again. Despite the positives, TNEs provided significantly lower biopsy acquisitions than standard endoscopes (83% vs. 100%, P = .001) because of the sheathing on the endoscope. Other studies have demonstrated the strengths of TNEs, including a study in which 38% of patients had a finding that changed management of their disease. TNEs should be considered a reliable screening tool for Barrett’s esophagus.
Other advances in imaging technology like the advent of the high-resolution complementary metal oxide semiconductor (CMOS), which is small enough to fit into a pill capsule, have led researchers to look into its effectiveness as a screening tool for Barrett’s esophagus. One meta-analysis of 618 patients found that the pooled sensitivity and specificity for diagnosis were 77% and 86%, respectively. Despite its ability to produce high-quality images, the device remains difficult to control and lacks the ability to obtain biopsy samples.
Another example of a swallowed medical device, the Cytosponge-TFF3 is an ingestible capsule that degrades in stomach acid. After 5 minutes, the capsule dissolves and releases a mesh sponge that will be withdrawn through the mouth, scraping the esophagus and gathering a sample. The Cytosponge has proven effective in the Barrett’s Esophagus Screening Trials (BEST) 1. The BEST 2 looked at 463 control and 647 patients with Barrett’s esophagus across 11 United Kingdom hospitals. The trial showed that the Cytosponge exhibited sensitivity of 79.9%, which increased to 87.2% in patients with more than 3 cm of circumferential Barrett’s metaplasia.
Breaking from the invasive nature of imaging scopes and the Cytosponge, some researchers are looking to use “liquid biopsy” or blood tests to detect abnormalities in the blood like DNA or microRNA (miRNA) to identify precursors or presence of a disease. Much remains to be done to develop a clinically meaningful test, but the use of miRNAs to detect disease is an intriguing option. miRNAs control gene expression, and their dysregulation has been associated with the development of many diseases. One study found that patients with Barrett’s esophagus had increased levels of miRNA-194, 215, and 143 but these findings were not validated in a larger study. Other studies have demonstrated similar findings, but more research must be done to validate these findings in larger cohorts.
Other novel detection therapies have been investigated, including serum adipokine and electronic nose breathing tests. The serum adipokine test looks at the metabolically active adipokines secreted in obese patients and those with metabolic syndrome to see if they could predict the presence of Barrett’s esophagus. Unfortunately, the data appear to be conflicting, but these tests can be used in conjunction with other tools to detect Barrett’s esophagus. Electronic nose breathing tests also work by detecting metabolically active compounds from human and gut bacterial metabolism. One study found that analyzing these volatile compounds could delineate between Barrett’s and non-Barrett’s patients with 82% sensitivity, 80% specificity, and 81% accuracy. Both of these technologies need large prospective studies in primary care to validate their clinical utility.
A discussion of the effectiveness of these screening tools would be incomplete without a discussion of their costs. Currently, endoscopic screening costs are high. Therefore, it is important to reserve these tools for the patients who will benefit the most – in other words, patients with clear risk factors for Barrett’s esophagus. Even the capsule endoscope is quite expensive because of the cost of materials associated with the tool.
Cost-effectivenes calculations surrounding the Cytosponge are particularly complicated. One analysis found the computed incremental cost-effectiveness ratio (ICER) of endoscopy, compared with Cytosponge, to have a range of $107,583-$330,361. The potential benefit that Cytosponge offers comes at an ICER for Cytosponge screening, compared with no screening, that ranges from $26,358 to $33,307. The numbers skyrocket when you consider what society would be willing to pay (up to $50,000 per quality-adjusted life-year gained).
With all of this information in mind, it would be useful to look at Barrett’s esophagus and the tools used to diagnose it from a broader perspective.
While the adoption of a new screening strategy could succeed where others have failed, Dr. Spechler points out the potential harm.
“There also is potential for harm in identifying asymptomatic patients with Barrett’s esophagus. In addition to the high costs and small risks of standard endoscopy, the diagnosis of Barrett’s esophagus can cause psychological stress, have a negative impact on quality of life, result in higher premiums for health and life insurance, and might identify innocuous lesions that lead to potentially hazardous invasive treatments. Efforts should therefore be continued to combine biomarkers for Barrett’s with risk stratification. Overall, while these vexing uncertainties must temper enthusiasm for the unqualified endorsement of any screening test for Barrett’s esophagus, the alternative of making no attempt to stem the rapidly rising incidence of a lethal malignancy also is unpalatable.”
The development of this commentary was supported solely by the American Gastroenterological Association Institute. No conflicts of interest were disclosed for this report.
SOURCE: Spechler S et al. Gastroenterology. 2018 May. doi: 10.1053/j.gastro.2018.03.031.
AGA Resource
AGA patient education on Barrett’s esophagus will help your patients better understand the disease and how to manage it. Learn more at gastro.org/patient-care.
FROM GASTROENTEROLOGY
PPI use not linked to cognitive decline
Use of proton pump inhibitors (PPIs) is not associated with cognitive decline in two prospective, population-based studies of identical twins published in the May issue of Clinical Gastroenterology and Hepatology.
“No stated differences in [mean cognitive] scores between PPI users and nonusers were significant,” wrote Mette Wod, PhD, of the University of Southern Denmark, Odense, with her associates.
Past research has yielded mixed findings about whether using PPIs affects the risk of dementia. Preclinical data suggest that exposure to these drugs affects amyloid levels in mice, but “the evidence is equivocal, [and] the results of epidemiologic studies [of humans] have also been inconclusive, with more recent studies pointing toward a null association,” the investigators wrote. Furthermore, there are only “scant” data on whether long-term PPI use affects cognitive function, they noted.
To help clarify the issue, they analyzed prospective data from two studies of twins in Denmark: the Study of Middle-Aged Danish Twins, in which individuals underwent a five-part cognitive battery at baseline and then 10 years later, and the Longitudinal Study of Aging Danish Twins, in which participants underwent the same test at baseline and 2 years later. The cognitive test assessed verbal fluency, forward and backward digit span, and immediate and delayed recall of a 12-item list. Using data from a national prescription registry, the investigators also estimated individuals’ PPI exposure starting 2 years before study enrollment.
In the study of middle-aged twins, participants who used high-dose PPIs before study enrollment had cognitive scores that were slightly lower at baseline, compared with PPI nonusers. Mean baseline scores were 43.1 (standard deviation, 13.1) and 46.8 (SD, 10.2), respectively. However, after researchers adjusted for numerous clinical and demographic variables, the between-group difference in baseline scores narrowed to just 0.69 (95% confidence interval, –4.98 to 3.61), which was not statistically significant.
The longitudinal study of older twins yielded similar results. Individuals who used high doses of PPIs had slightly higher adjusted mean baseline cognitive score than did nonusers, but the difference did not reach statistical significance (0.95; 95% CI, –1.88 to 3.79).
Furthermore, prospective assessments of cognitive decline found no evidence of an effect. In the longitudinal aging study, high-dose PPI users had slightly less cognitive decline (based on a smaller change in test scores over time) than did nonusers, but the adjusted difference in decline between groups was not significant (1.22 points; 95% CI, –3.73 to 1.29). In the middle-aged twin study, individuals with the highest levels of PPI exposure (at least 1,600 daily doses) had slightly less cognitive decline than did nonusers, with an adjusted difference of 0.94 points (95% CI, –1.63 to 3.50) between groups, but this did not reach statistical significance.
“This study is the first to examine the association between long-term PPI use and cognitive decline in a population-based setting,” the researchers concluded. “Cognitive scores of more than 7,800 middle-aged and older Danish twins at baseline did not indicate an association with previous PPI use. Follow-up data on more than 4,000 of these twins did not indicate that use of this class of drugs was correlated to cognitive decline.”
Odense University Hospital provided partial funding. Dr. Wod had no disclosures. Three coinvestigators disclosed ties to AstraZeneca and Bayer AG.
SOURCE: Wod M et al. Clin Gastroenterol Hepatol. 2018 Feb 3. doi: 10.1016/j.cgh.2018.01.034.
Over the last 20 years, multiple retrospective studies have shown associations between the use of proton pump inhibitors (PPIs) and a wide constellation of serious medical complications. However, detecting an association between a drug and a complication does not necessarily indicate that the drug was indeed responsible.
This well-done study by Wod et al., which shows no significant association between PPI use and either decreased cognition or cognitive decline, will, I hope, serve to allay any misplaced concerns that may exist among clinicians and patients about PPI use in this population. This paper has notable strengths, most importantly access to a direct, unbiased assessment of changes in cognitive function over time and accurate assessment of PPI exposure. Short of performing a controlled, prospective trial, we are unlikely to see better evidence indicating a lack of a causal relationship between PPI use and changes in cognitive function. This provides assurance that patients with indications for PPI use can continue to use them.
Laura E. Targownik, MD, MSHS, FRCPC, is section head, section of gastroenterology, University of Manitoba, Winnipeg, Canada; Gastroenterology and Endoscopy Site Lead, Health Sciences Centre, Winnipeg; associate director, University of Manitoba Inflammatory Bowel Disease Research Centre; associate professor, department of internal medicine, section of gastroenterology, University of Manitoba. She has no conflicts of interest.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: Use of proton pump inhibitors was not associated with cognitive decline.
Major finding: Mean baseline cognitive scores did not significantly differ between PPI users and nonusers, nor did changes in cognitive scores over time.
Study details: Two population-based studies of twins in Denmark.
Disclosures: Odense University Hospital provided partial funding. Dr. Wod had no disclosures. Three coinvestigators disclosed ties to AstraZeneca and Bayer AG.
Source: Wod M et al. Clin Gastroenterol Hepatol. 2018 Feb 3. doi: 10.1016/j.cgh.2018.01.034.
Alpha fetoprotein boosted detection of early-stage liver cancer
For patients with cirrhosis, adding serum alpha fetoprotein testing to ultrasound significantly boosted its ability to detect early-stage hepatocellular carcinoma, according to the results of a systematic review and meta-analysis reported in the May issue of Gastroenterology.
Used alone, ultrasound detected only 45% of early-stage hepatocellular carcinomas (95% confidence interval, 30%-62%), reported Kristina Tzartzeva, MD, of the University of Texas, Dallas, with her associates. Adding alpha fetoprotein (AFP) increased this sensitivity to 63% (95% CI, 48%-75%; P = .002). Few studies evaluated alternative surveillance tools, such as CT or MRI.
Diagnosing liver cancer early is key to survival and thus is a central issue in cirrhosis management. However, the best surveillance strategy remains uncertain, hinging as it does on sensitivity, specificity, and cost. The American Association for the Study of Liver Diseases and the European Association for the Study of the Liver recommend that cirrhotic patients undergo twice-yearly ultrasound to screen for hepatocellular carcinoma (HCC), but they disagree about the value of adding serum biomarker AFP testing. Meanwhile, more and more clinics are using CT and MRI because of concerns about the unreliability of ultrasound. “Given few direct comparative studies, we are forced to primarily rely on indirect comparisons across studies,” the reviewers wrote.
To do so, they searched MEDLINE and Scopus and identified 32 studies of HCC surveillance that comprised 13,367 patients, nearly all with baseline cirrhosis. The studies were published from 1990 to August 2016.
Ultrasound detected HCC of any stage with a sensitivity of 84% (95% CI, 76%-92%), but its sensitivity for detecting early-stage disease was less than 50%. In studies that performed direct comparisons, ultrasound alone was significantly less sensitive than ultrasound plus AFP for detecting all stages of HCC (relative risk, 0.80; 95% CI, 0.72-0.88) and early-stage disease (0.78; 0.66-0.92). However, ultrasound alone was more specific than ultrasound plus AFP (RR, 1.08; 95% CI, 1.05-1.09).
Four studies of about 900 patients evaluated cross-sectional imaging with CT or MRI. In one single-center, randomized trial, CT had a sensitivity of 63% for detecting early-stage disease, but the 95% CI for this estimate was very wide (30%-87%) and CT did not significantly outperform ultrasound (Aliment Pharmacol Ther. 2013;38:303-12). In another study, MRI and ultrasound had significantly different sensitivities of 84% and 26% for detecting (usually) early-stage disease (JAMA Oncol. 2017;3[4]:456-63).
“Ultrasound currently forms the backbone of professional society recommendations for HCC surveillance; however, our meta-analysis highlights its suboptimal sensitivity for detection of hepatocellular carcinoma at an early stage. Using ultrasound in combination with AFP appears to significantly improve sensitivity for detecting early HCC with a small, albeit statistically significant, trade-off in specificity. There are currently insufficient data to support routine use of CT- or MRI-based surveillance in all patients with cirrhosis,” the reviewers concluded.
The National Cancer Institute and Cancer Prevention Research Institute of Texas provided funding. None of the reviewers had conflicts of interest.
SOURCE: Tzartzeva K et al. Gastroenterology. 2018 Feb 6. doi: 10.1053/j.gastro.2018.01.064.
FROM GASTROENTEROLOGY
Key clinical point: Ultrasound unreliably detects hepatocellular carcinoma, but adding alpha fetoprotein increases its sensitivity.
Major finding: Used alone, ultrasound detected only 45% of early-stage cases. Adding alpha fetoprotein increased this sensitivity to 63% (P = .002).
Study details: Systematic review and meta-analysis of 32 studies comprising 13,367 patients and spanning from 1990 to August 2016.
Disclosures: The National Cancer Institute and Cancer Prevention Research Institute of Texas provided funding. None of the researchers had conflicts of interest.
Source: Tzartzeva K et al. Gastroenterology. 2018 Feb 6. doi: 10.1053/j.gastro.2018.01.064.
One in seven Americans had fecal incontinence
One in seven respondents to a national survey reported a history of fecal incontinence, including one-third within the preceding week, investigators reported.
“Fecal incontinence [FI] is age-related and more prevalent among individuals with inflammatory bowel disease, celiac disease, irritable bowel syndrome, or diabetes than people without these disorders. Proactive screening for FI among these groups is warranted,” Stacy B. Menees, MD, and her associates wrote in the May issue of Gastroenterology (doi: 10.1053/j.gastro.2018.01.062).
Accurately determining the prevalence of FI is difficult because patients are reluctant to disclose symptoms and physicians often do not ask. In one study of HMO enrollees, about a third of patients had a history of FI but fewer than 3% had a medical diagnosis. In other studies, the prevalence of FI has ranged from 2% to 21%. Population aging fuels the need to narrow these estimates because FI becomes more common with age, the investigators noted.
Accordingly, in October 2015, they used a mobile app called MyGIHealth to survey nearly 72,000 individuals about fecal incontinence and other GI symptoms. The survey took about 15 minutes to complete, in return for which respondents could receive cash, shop online, or donate to charity. The investigators assessed FI severity by analyzing responses to the National Institutes of Health FI Patient Reported Outcomes Measurement Information System questionnaire.
Of the 10,033 respondents reporting a history of fecal incontinence (14.4%), 33.3% had experienced at least one episode in the past week. About a third of individuals with FI said it interfered with their daily activities. “Increasing age and concomitant diarrhea and constipation were associated with increased odds [of] FI,” the researchers wrote. Compared with individuals aged 18-24 years, the odds of having ever experienced FI rose by 29% among those aged 25-45 years, by 72% among those aged 45-64 years, and by 118% among persons aged 65 years and older.
Self-reported FI also was significantly more common among individuals with Crohn’s disease (41%), ulcerative colitis (37%), celiac disease (34%), irritable bowel syndrome (13%), or diabetes (13%) than it was among persons without these conditions. Corresponding odds ratios ranged from about 1.5 (diabetes) to 2.8 (celiac disease).
For individuals reporting FI within the past week, greater severity (based on their responses to the NIH FI Patient Reported Outcomes Measurement Information System questionnaire) significantly correlated with being non-Hispanic black (P = .03) or Latino (P = .02) and with having Crohn’s disease (P less than .001), celiac disease (P less than .001), diabetes (P = .04), HIV infection (P = .001), or chronic idiopathic constipation (P less than .001). “Our study is the first to find differences among racial/ethnic groups regarding FI severity,” the researchers noted. They did not speculate on reasons for the finding, but stressed the importance of screening for FI and screening patients with FI for serious GI diseases.
Ironwood Pharmaceuticals funded the National GI Survey, but the investigators received no funding for this study. Three coinvestigators reported ties to Ironwood Pharmaceuticals and My Total Health.
SOURCE: Menees SB et al. Gastroenterology. 2018 Feb 3. doi: 10.1053/j.gastro.2018.01.062.
Fecal incontinence (FI) is a common problem associated with significant social anxiety and decreased quality of life for patients who experience it. Unfortunately, patients are not always forthcoming regarding their symptoms, and physicians often fail to inquire directly about incontinence symptoms.
Previous studies have shown the prevalence of FI to vary widely across different populations. Using novel technology through a mobile app, researchers at the University of Michigan, Ann Arbor, and Cedars-Sinai Medical Center, Los Angeles, have been able to perform the largest population-based study of community-dwelling Americans. They confirmed that FI is indeed a common problem experienced across the spectrum of age, sex, race, and socioeconomic status and interferes with the daily activities of more than one-third of those who experience it.
This study supports previous findings of an age-related increase in FI, with the highest prevalence in patients over age 65 years. Interestingly, males were more likely than females to have experienced FI within the past week, but not more likely to have ever experienced FI. While FI is often thought of as a primarily female problem (related to past obstetrical injury), it is important to remember that it likely affects both sexes equally.
Other significant risk factors include diabetes and gastrointestinal disorders. This study also confirms prior population-based findings that patients with chronic constipation are more likely to suffer FI. Finally, it identified risk factors associated with FI symptom severity, including diabetes, HIV/AIDS, Crohn’s disease, celiac disease, and chronic constipation, and it is the first study to show differences between racial/ethnic groups, suggesting higher FI symptom scores in Latinos and African-Americans.
The strengths of this study include its size and the anonymity provided by an internet-based survey on a potentially embarrassing topic; however, the internet-based format may also have excluded older individuals and those without regular internet access.
In summary, I believe this is an important study that confirms FI is common among Americans while helping to identify potential risk factors for the presence and severity of FI. I am hopeful that with increased awareness, health care providers will become more proactive in screening their patients for FI, particularly in these higher-risk populations.
Stephanie A. McAbee, MD, is an assistant professor of medicine in the division of gastroenterology, hepatology, and nutrition at Vanderbilt University Medical Center, Nashville, Tenn. She has no conflicts of interest.
One in seven respondents to a national survey reported a history of fecal incontinence, and one-third of those respondents had experienced it within the preceding week, investigators reported.
“Fecal incontinence [FI] is age-related and more prevalent among individuals with inflammatory bowel disease, celiac disease, irritable bowel syndrome, or diabetes than people without these disorders. Proactive screening for FI among these groups is warranted,” Stacy B. Menees, MD, and her associates wrote in the May issue of Gastroenterology (doi: 10.1053/j.gastro.2018.01.062).
Accurately determining the prevalence of FI is difficult because patients are reluctant to disclose symptoms and physicians often do not ask. In one study of HMO enrollees, about a third of patients had a history of FI but fewer than 3% had a medical diagnosis. In other studies, the prevalence of FI has ranged from 2% to 21%. Population aging fuels the need to narrow these estimates because FI becomes more common with age, the investigators noted.
Accordingly, in October 2015, they used a mobile app called MyGIHealth to survey nearly 72,000 individuals about fecal incontinence and other GI symptoms. The survey took about 15 minutes to complete, in return for which respondents could receive cash, shop online, or donate to charity. The investigators assessed FI severity by analyzing responses to the National Institutes of Health FI Patient Reported Outcomes Measurement Information System questionnaire.
Of the 10,033 respondents reporting a history of fecal incontinence (14.4%), 33.3% had experienced at least one episode in the past week. About a third of individuals with FI said it interfered with their daily activities. “Increasing age and concomitant diarrhea and constipation were associated with increased odds [of] FI,” the researchers wrote. Compared with individuals aged 18-24 years, the odds of having ever experienced FI rose by 29% among those aged 25-44 years, by 72% among those aged 45-64 years, and by 118% among persons aged 65 years and older.
Self-reported FI also was significantly more common among individuals with Crohn’s disease (41%), ulcerative colitis (37%), celiac disease (34%), irritable bowel syndrome (13%), or diabetes (13%) than it was among persons without these conditions. Corresponding odds ratios ranged from about 1.5 (diabetes) to 2.8 (celiac disease).
For individuals reporting FI within the past week, greater severity (based on their responses to the NIH FI Patient Reported Outcomes Measurement Information System questionnaire) significantly correlated with being non-Hispanic black (P = .03) or Latino (P = .02) and with having Crohn’s disease (P less than .001), celiac disease (P less than .001), diabetes (P = .04), human immunodeficiency virus infection (P = .001), or chronic idiopathic constipation (P less than .001). “Our study is the first to find differences among racial/ethnic groups regarding FI severity,” the researchers noted. They did not speculate on reasons for the finding, but stressed the importance of screening for FI and screening patients with FI for serious GI diseases.
Ironwood Pharmaceuticals funded the National GI Survey, but the investigators received no funding for this study. Three coinvestigators reported ties to Ironwood Pharmaceuticals and My Total Health.
SOURCE: Menees SB et al. Gastroenterology. 2018 Feb 3. doi: 10.1053/j.gastro.2018.01.062.
FROM GASTROENTEROLOGY
Key clinical point: One in seven (14%) individuals had experienced fecal incontinence (FI), one-third within the past week.
Major finding: Self-reported FI was significantly more common among individuals with Crohn’s disease (41%), ulcerative colitis (37%), celiac disease (34%), irritable bowel syndrome (13%), or diabetes (13%) than among individuals without these diagnoses.
Study details: Analysis of 71,812 responses to the National GI Survey, conducted in October 2015.
Disclosures: Although Ironwood Pharmaceuticals funded the National GI Survey, the investigators received no funding for this study. Three coinvestigators reported ties to Ironwood Pharmaceuticals and My Total Health.
Source: Menees SB et al. Gastroenterology. 2018 Feb 3. doi: 10.1053/j.gastro.2018.01.062.
Heavy drinking did not worsen clinical outcomes from drug-induced liver injury
Heavy drinking was not associated with higher proportions of liver-related deaths or liver transplantation among patients with drug-induced liver injury (DILI), according to the results of a prospective multicenter cohort study reported in the May issue of Clinical Gastroenterology and Hepatology.
Anabolic steroids were the most common cause of DILI among heavy drinkers, defined as men who averaged more than three drinks a day or women who averaged more than two drinks daily, said Lara Dakhoul, MD, of Indiana University, Indianapolis, and her associates. There also was no evidence that heavy alcohol consumption increased the risk of liver injury attributable to isoniazid exposure, the researchers wrote.
Although consuming alcohol significantly increases the risk of acetaminophen-induced liver injury, there is much less clarity about the relationship between drinking and hepatotoxicity from drugs such as duloxetine or antituberculosis medications, the researchers noted. In fact, one recent study found that drinking was associated with less severe liver injury among individuals with DILI. To better elucidate these links, the investigators studied 1,198 individuals with confirmed or probable DILI who enrolled in the Drug-Induced Liver Injury Network (DILIN) study between 2004 and 2016. At enrollment, all participants were asked if they consumed alcohol, and those who reported drinking within the past 12 months were offered a shortened version of the Skinner Alcohol Dependence Scale to collect details on alcohol consumption, including type, amount, and frequency.
In all, 601 persons reported consuming at least one alcoholic drink in the preceding year, of whom 348 completed the Skinner questionnaire. A total of 80 individuals reported heavy alcohol consumption. Heavy drinkers were typically in their early 40s, while nondrinkers tended to be nearly 50 years old (P less than .01). Heavy drinkers were also more often men (63%) while nondrinkers were usually women (65%; P less than .01). Heavy drinkers were significantly more likely to have DILI secondary to anabolic steroid exposure (13%) than were nondrinkers (2%; P less than .001). However, latency, pattern of liver injury, peak enzyme levels, and patterns of recovery from steroid hepatotoxicity were similar regardless of alcohol history.
A total of eight patients with DILI died of liver-related causes or underwent liver transplantation, and proportions of patients with these outcomes were similar regardless of alcohol history. These eight patients had no evidence of hepatitis C virus infection, but three appeared to have underlying alcoholic liver disease with superimposed acute-on-chronic liver failure. Heavy drinkers did not have significantly higher DILI severity scores than nondrinkers, but they did have significantly higher peak serum levels of alanine aminotransferase (1,323 U/L vs. 754, respectively; P = .02) and significantly higher levels of bilirubin (16.1 vs. 12.7 mg/dL; P = .03).
The two fatal cases of DILI among heavy drinkers involved a 44-year-old man with underlying alcoholic cirrhosis and steatohepatitis who developed acute-on-chronic liver failure 11 days after starting niacin, and a 76-year-old man with chronic obstructive pulmonary disease and bronchitis flare who developed severe liver injury and skin rash 6 days after starting azithromycin.
The study was not able to assess whether heavy alcohol consumption contributed to liver injury from specific agents, the researchers said. Additionally, a substantial number of drinkers did not complete the Skinner questionnaire, and those who did might have underestimated or underreported their own alcohol consumption. “Counterbalancing these issues are the [study’s] unique strengths, such as prospective design, larger sample size, well-characterized DILI phenotype, and careful, structured adjudication of causality and severity,” the researchers wrote.
Funders included the National Institute of Diabetes and Digestive and Kidney Diseases and the National Cancer Institute. Dr. Dakhoul had no conflicts of interest. One coinvestigator disclosed ties to numerous pharmaceutical companies.
SOURCE: Dakhoul L et al. Clin Gastroenterol Hepatol. 2018 Jan 3. doi: 10.1016/j.cgh.2017.12.036.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: Heavy alcohol consumption was not associated with worse outcomes of drug-induced liver toxicity.
Major finding: Proportions of patients with liver-related deaths and liver transplantation were statistically similar regardless of alcohol consumption history (P = .18).
Study details: Prospective study of 1,198 individuals with probable drug-induced liver injury.
Disclosures: Funders included the National Institute of Diabetes and Digestive and Kidney Diseases and the National Cancer Institute. Dr. Dakhoul had no conflicts. One coinvestigator disclosed ties to numerous pharmaceutical companies.
Source: Dakhoul L et al. Clin Gastroenterol Hepatol. 2018 Jan 3. doi: 10.1016/j.cgh.2017.12.036.