Engineered liver models to study human hepatotropic pathogens
Recently, exciting progress has been made in engineering liver models to study hepatotropic pathogens in the context of liver-dependent infectious diseases. A review by Nil Gural and colleagues, published in Cellular and Molecular Gastroenterology and Hepatology, described these unique models. Furthermore, the review discussed the progress made in combining individual approaches and in pairing the most appropriate model system with the right readout modality.
This is crucial for the development and validation of therapeutic interventions, such as drug and vaccine candidates that may act on liver cells. The engineered models range from two-dimensional (2-D) cultures of primary human hepatocytes (HHs) and stem cell–derived progeny to three-dimensional (3-D) organoid cultures and humanized rodent models. The major human hepatotropic pathogens include hepatitis C virus (HCV), hepatitis B virus (HBV), and the protozoan parasites Plasmodium falciparum and P. vivax. While HBV and HCV can cause chronic liver diseases such as cirrhosis and hepatocellular carcinoma, Plasmodium parasites cause malaria. The use of cancer cell lines and animal models to study host-pathogen interactions is limited by uncontrolled proliferation, abnormal liver-specific functions, and the stringent host dependency of the hepatotropic pathogens. HHs are thus the ideal system in which to study these pathogens; however, maintaining these cells ex vivo is challenging.
For instance, 2-D monolayers of human hepatoma-derived cell lines (such as HepG2-A16 and HepaRG) are easier to maintain, to amplify for scaling up, and to use for drug screening, thus representing a renewable alternative to primary hepatocytes. These model systems have been useful for studying short-term infections with human Plasmodium parasites (P. vivax and P. falciparum) and with other hepatotropic pathogens such as Ebola, Lassa, human cytomegalovirus, and dengue viruses, as well as for generating virion stocks (HCV, HBV). For long-term analyses, and for clinical isolates of pathogens that do not infect hepatoma cells, immortalized cell lines have been engineered to differentiate and maintain HH functions for longer durations. Additionally, cocultivation of primary hepatocytes with nonparenchymal cells, or of hepatocytes with mouse fibroblasts, preserves the hepatocyte phenotype. The latter, a self-assembling coculture system, can maintain an infection for over 30 days and be used for testing anti-HBV drugs. A micropatterned coculture system, in which hepatocytes are positioned in “islands” via photolithographic patterning of collagen and surrounded by mouse embryonic fibroblasts, can maintain hepatocyte phenotypes for 4-6 weeks and remains permissive to P. falciparum, P. vivax, HBV, and HCV infections. Furthermore, micropatterned coculture systems support the full developmental liver stages of both P. falciparum and P. vivax, with the release of merozoites from hepatocytes and their subsequent infection of overlaid human red blood cells.
Alternatively, embryonic stem cells and induced pluripotent stem cells of human origin can be differentiated into hepatocyte-like cells that enable investigation of host genetics within the context of host-pathogen interactions, and can also be used for target identification in drug development. However, stem cell cultures require significant culture expertise and may not represent a fully differentiated adult hepatocyte phenotype.
Although 2-D cultures offer ease of use and of monitoring infection, they often lack the complexity of the liver microenvironment and the impact of different cell types on liver infections. A 3-D radial-flow bioreactor (cylindrical matrix) was able to maintain and amplify human hepatoma cells (for example, Huh7 cells) by providing a sufficient oxygen and nutrient supply, supporting productive HCV infection for months. Other 3-D cultures of hepatoma cells using polyethylene glycol–based hydrogels, thermoreversible gelatin polymers, alginate, galactosylated cellulosic sponges, Matrigel, and collagen have been developed and shown to be permissive to HCV or HBV infections. Although 3-D coculture systems exhibit better hepatic function and differential gene expression profiles compared with their 2-D counterparts, they require large quantities of cells and are a challenge to scale up. Recently, several liver-on-a-chip models have been created that mimic the shear stress, blood flow, and extracellular environment within a tissue, holding great potential for modeling liver-specific pathogens.
Humanized mouse models with ectopic human liver structures have been developed in which primary HHs are transplanted following liver injury. Chimeric mouse models, including Alb-uPA/SCID (HHs transplanted into urokinase-type plasminogen activator–transgenic severe combined immunodeficient mice), FNRG/FRG (HHs transplanted into Fah-/-, Rag2-/-, and Il2rg-/- mice with or without a nonobese diabetic background), and TK-NOG (HHs transplanted into NOG mice expressing herpes simplex virus type-1 thymidine kinase), have been validated for HCV, HBV, P. falciparum, and P. vivax infections. It is, however, laborious to create and maintain chimeric mouse models and to monitor infection processes in them.
It is important to note that the selection of model system and the readout modality to monitor infection will vary based on the experimental question at hand. Tissue engineering has thus far made significant contributions to the knowledge of hepatotropic pathogens; a continued effort to develop better liver models is envisioned.
Gural et al. present a timely and outstanding review of the advances made in the engineering of human-relevant liver culture platforms for investigating the molecular mechanisms of infectious diseases (e.g., hepatitis B/C viruses and Plasmodium parasites that cause malaria) and developing better drugs or vaccines against such diseases. The authors cover a continuum of platforms with increasing physiological complexity, such as 2-D hepatocyte monocultures on collagen-coated plastic, 2-D cocultures of hepatocytes and nonparenchymal cells (both randomly distributed and patterned into microdomains to optimize cell-cell contact), 3-D cultures/cocultures housed in biomaterial-based scaffolds, perfusion-based bioreactors to induce cell growth and phenotypic stability, and finally rodents with humanized livers. Cell sourcing considerations for building human-relevant platforms are discussed, including cancerous cell lines, primary human hepatocytes, and stem cell–derived hepatocytes (e.g., induced pluripotent stem cells).
From the discussions of various studies, it is clear that this field has benefitted tremendously from advances in tissue engineering, including microfabrication tools adapted from the semiconductor industry, to construct human liver platforms that last for several weeks in vitro, can be infected with hepatitis B/C virus and Plasmodium parasites with high efficiencies, and are very useful for high-throughput and high-content drug screening applications. The latest protocols in isolating and cryopreserving primary human hepatocytes and differentiating stem cells into hepatocyte-like cells with adult functions help reduce the reliance on abnormal or cancerous cell lines for building platforms with higher relevance to the clinic. Ultimately, continued advances in microfabricated human liver platforms can aid our understanding of liver infections and spur further drug/vaccine development.
Salman R. Khetani, PhD, is associate professor, department of bioengineering, University of Illinois at Chicago. He has no conflicts of interest.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
VIDEO: Model supports endoscopic resection for some T1b esophageal adenocarcinomas
Endoscopic treatment of T1a esophageal adenocarcinoma outperformed esophagectomy across a range of ages and comorbidity levels in a Markov model.
Esophagectomy produced 0.16 more unadjusted life-years, but led to 0.27 fewer quality-adjusted life-years (QALYs), in the hypothetical case of a 75-year-old man with T1aN0M0 esophageal adenocarcinoma (EAC) and a Charlson comorbidity index score of 0, reported Jacqueline N. Chu, MD, of Massachusetts General Hospital, Boston, and her associates. “[We] believe QALYs are a more important endpoint because of the significant morbidity associated with esophagectomy,” they wrote in the March issue of Clinical Gastroenterology and Hepatology.
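The study's model itself is far more detailed, but the life-year versus QALY trade-off it captures can be sketched as a toy Markov cohort simulation. All transition probabilities and utility weights below are hypothetical placeholders, not the values Chu et al. used:

```python
# Toy Markov cohort sketch of a two-strategy comparison.
# A cohort starts fully alive; each yearly cycle a fraction dies,
# and survivors accrue 1 unadjusted life-year plus `utility` QALYs.
# All inputs are hypothetical, NOT the Chu et al. model parameters.

def run_cohort(p_die_per_cycle: float, utility: float, cycles: int = 20):
    alive = 1.0          # fraction of the cohort still alive
    life_years = 0.0
    qalys = 0.0
    for _ in range(cycles):
        alive *= 1.0 - p_die_per_cycle   # deaths this cycle
        life_years += alive              # 1 life-year per survivor-year
        qalys += alive * utility         # quality-weighted year
    return life_years, qalys

# Hypothetical trade-off: surgery lowers yearly mortality but carries
# worse posttreatment quality of life than endoscopic therapy.
ly_surg, qaly_surg = run_cohort(p_die_per_cycle=0.10, utility=0.65)
ly_endo, qaly_endo = run_cohort(p_die_per_cycle=0.12, utility=0.80)
# With these inputs, surgery wins on unadjusted life-years yet loses
# on QALYs, mirroring the qualitative T1a result reported above.
```

The sketch shows why the two endpoints can disagree: life-years count survival alone, while QALYs discount each surviving year by a quality weight, so a morbid treatment can extend life yet yield fewer QALYs.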
Source: American Gastroenterological Association
In contrast, the model portrayed the management of T1b EAC as “an individualized decision” – esophagectomy was preferable in 60- to 70-year-old patients with T1b EAC, but serial endoscopic treatment was better when patients were older, with more comorbidities, the researchers said. “For the sickest patients, those aged 80 and older with comorbidity index of 2, endoscopic treatment not only provided more QALYs but more unadjusted life years as well.”
Treatment of T1a EAC is transitioning from esophagectomy to serial endoscopic resection, which physicians still tend to regard as too risky in T1b EAC. The Markov model evaluated the effectiveness and cost-effectiveness of the two approaches in hypothetical T1a and T1b patients of various ages and comorbidities, using cancer death data from the Surveillance, Epidemiology, and End Results (SEER)-Medicare database and published cost data converted to 2017 U.S. dollars based on the U.S. Bureau of Labor Statistics’ Consumer Price Index.
Like the T1a case, the T1b base case consisted of a 75-year-old man with a Charlson comorbidity index of 0. Esophagectomy produced 0.72 more unadjusted life-years than did endoscopic treatment (5.73 vs. 5.01) while yielding 0.22 more QALYs (4.07 vs. 3.85). Esophagectomy cost $156,981 more, but the model did not account for the costs of chemotherapy, radiation, or palliative care, all of which are more likely with endoscopic resection than with esophagectomy, the researchers noted.
In sensitivity analyses, endoscopic treatment optimized quality of life in T1b EAC patients who were older than 80 years and had a comorbidity index of 1 or 2. Beyond that, treatment choice depended on posttreatment variables. “[If] a patient considered his or her quality of life postesophagectomy nearly equal to, or preferable to, [that] postendoscopic treatment, esophagectomy would be the optimal treatment strategy,” the investigators wrote. “An example would be the patient who would rather have an esophagectomy than worry about recurrence with endoscopic treatment.”
Pathologic analysis of T1a EACs can be inconsistent, and the model did not test whether high versus low pathologic risk affected treatment preference, the researchers said. They added data on T1NOS (T1 not otherwise specified) EACs to the model because the SEER-Medicare database included so few T1b endoscopic cases, but T1NOS patients had the worst outcomes and were in fact probably higher stage than T1. Fully 31% of endoscopy patients were T1NOS, compared with only 11% of esophagectomy patients, which would have biased the model against endoscopic treatment, according to the investigators.
The National Institutes of Health provided funding. Dr. Chu reported having no conflicts of interest. Three coinvestigators disclosed ties to CSA Medical, Ninepoint, C2 Therapeutics, Medtronic, and Trio Medicines. The remaining coinvestigators had no conflicts.
SOURCE: Chu JN et al. Clin Gastroenterol Hepatol. 2017 Nov 24. doi: 10.1016/j.cgh.2017.10.024.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: A Markov model supports endoscopic resection for some T1b esophageal adenocarcinomas.
Major finding: Endoscopic resection was preferred in T1b patients who were more than 80 years old or had a Charlson comorbidity index of 1 or 2.
Data source: A Markov model with Surveillance, Epidemiology, and End Results (SEER) Medicare mortality data and published cost data converted to 2017 U.S. dollars based on the national Consumer Price Index.
Disclosures: The National Institutes of Health provided funding. Dr. Chu reported having no conflicts of interest. Three coinvestigators disclosed ties to CSA Medical, Ninepoint, C2 Therapeutics, Medtronic, and Trio Medicines. The remaining coinvestigators had no conflicts.
Source: Chu JN et al. Clin Gastroenterol Hepatol. 2017 Nov 24. doi: 10.1016/j.cgh.2017.10.024.
Sofosbuvir/ledipasvir looks good in HBV-coinfected patients
For patients coinfected with hepatitis B and C viruses, 12 weeks of therapy with sofosbuvir and ledipasvir produced a 100% rate of sustained virologic response, with no cases of liver failure or death, in a phase 3b, multicenter, open-label study. “Although we observed increases in HBV DNA in most patients, these increases were [usually] not associated with ALT [alanine aminotransferase] flares or clinical complications,” reported Chun-Jen Liu, MD, of National Taiwan University College of Medicine and Hospital, Taipei, and his associates. Although nearly two-thirds of patients developed HBV reactivation, less than 5% developed alanine aminotransferase rises at least twice the upper limit of normal, and only one patient had symptomatic HBV reactivation, which entecavir therapy resolved. This study was the first to prospectively evaluate the risk of HBV reactivation during HCV treatment, the researchers wrote in the March issue of Gastroenterology.
Because chronic hepatitis C virus infection tends to suppress HBV replication, peginterferon/ribavirin or direct-acting anti-HCV treatment can reactivate HBV infection, especially in patients who test positive for hepatitis B surface antigen (HBsAg). Left untreated, reactivated HBV can lead to fulminant hepatitis, liver failure, and death, as noted on recently mandated boxed warnings.
Accordingly, guidelines recommend testing patients for HBV infection before starting HCV treatment. The study enrolled 111 coinfected patients; about two-thirds were female, and 16% had compensated cirrhosis. All tested positive for HBsAg at screening, and all but one also tested positive at baseline. Mean baseline HBV DNA levels were 2.1 log10 IU/mL. Patients received 90 mg ledipasvir plus 400 mg sofosbuvir for 12 weeks, and levels of HCV RNA, HBV DNA, and HBsAg were tested at weeks 1, 2, 4, 8, 12, posttreatment week 4, and then every 12 weeks until posttreatment week 108.
In all, 70 (63%) patients developed HBV reactivation, including 84% of the 37 patients with undetectable HBV DNA at baseline. During treatment, none of these patients had an ALT rise more than twice the upper limit of normal. By 48 weeks posttreatment, however, 77% still had quantifiable HBV DNA, and two had marked ALT rises. Furthermore, by posttreatment week 53, one of these patients developed bilirubinemia and symptomatic HBV infection (malaise, anorexia, scleral jaundice, and nausea), which resolved after treatment with entecavir.
A total of 74 patients had quantifiable baseline HBV DNA (at least 20 IU/mL). Three received entecavir or tenofovir disoproxil fumarate based on confirmed HBV reactivation with a concomitant ALT rise of at least twice the upper limit of normal. All were asymptomatic. There were no cases of liver failure or death.
“Regardless of HBV DNA and/or ALT elevations, no patient had signs of liver failure,” the researchers wrote. “Our results support the recommendations put forth in clinical treatment guidelines: HCV-infected patients should be evaluated for HBV infection prior to HCV treatment with direct-acting antivirals. Those who are HBsAg positive should be monitored during and after treatment for HBV reactivation, and treatment should be initiated in accordance with existing guidelines.”
Gilead funded the study. Dr. Liu and 12 coinvestigators reported having no conflicts of interest. Nine coinvestigators reported being employees and shareholders of Gilead, and one coinvestigator reported consulting for Gilead. The senior author disclosed ties to Roche, Bristol-Myers Squibb, Johnson & Johnson, Bayer, MSD, and Taiha.
SOURCE: Liu C-J et al. Gastroenterology. 2017 Nov 21. doi: 10.1053/j.gastro.2017.11.011.
FROM GASTROENTEROLOGY
Key clinical point: Combination therapy with sofosbuvir/ledipasvir effectively treated chronic hepatitis C infection in hepatitis B coinfected patients.
Major finding: The rate of sustained viral response was 100% at 12 weeks. Most patients (63%) had an increase in hepatitis B viral DNA, but only 5% had a concomitant increase in alanine aminotransferase. There were no cases of liver failure or death.
Data source: A phase 3b, multicenter, single-arm, open-label study of 111 coinfected patients.
Disclosures: Gilead funded the study. Dr. Liu and 12 coinvestigators reported having no conflicts of interest. Nine coinvestigators reported being employees and shareholders of Gilead, and one coinvestigator reported consulting for Gilead. The senior author disclosed ties to Roche, Bristol-Myers Squibb, Johnson & Johnson, Bayer, MSD, and Taiha.
Source: Liu C-J et al. Gastroenterology. 2017 Nov 21. doi: 10.1053/j.gastro.2017.11.011.
Ulcerative colitis is disabling over time
Between 70% and 80% of patients with ulcerative colitis relapsed within 10 years of diagnosis and 10%-15% had aggressive disease in a meta-analysis of 17 population-based cohorts spanning 1935 to 2016.
However, “contemporary population-based cohorts of patients diagnosed in the biologic era are lacking,” [and they] “may inform us of the population-level impact of paradigm shifts in approach to ulcerative colitis management during the last decade, such as early use of disease-modifying biologic therapy and treat-to-target [strategies],” wrote Mathurin Fumery, MD, of the University of California San Diego, La Jolla. The report was published in Clinical Gastroenterology and Hepatology (2017 Jun 16. doi: 10.1016/j.cgh.2017.06.016).
Population-based observational cohort studies follow an entire group in a geographic area over an extended time, which better characterizes the true natural history of disease outside highly controlled settings of clinical trials, the reviewers noted. They searched MEDLINE for population-based longitudinal studies of adults with newly diagnosed ulcerative colitis, whose medical records were reviewed, and who were followed for at least a year. They identified 60 such studies of 17 cohorts that included 15,316 patients in southern and northern Europe, Australia, Israel, the United States, Canada, China, Hong Kong, Indonesia, Sri Lanka, Macau, Malaysia, Singapore, and Thailand.
Left-sided colitis was most common (median, 40%; interquartile range, 33%-45%) and about 10%-30% of patients had disease extension. Patients tended to have mild to moderate disease that was most active at diagnosis and subsequently alternated between remission and mild activity. However, nearly half of patients were hospitalized at some point because of ulcerative colitis, and about half of that subgroup was rehospitalized within 5 years. Furthermore, up to 15% of patients with ulcerative colitis underwent colectomy within 10 years, a risk that mucosal healing helped mitigate. Use of corticosteroids dropped over time as the prevalence of immunomodulators and anti–tumor necrosis factor therapy rose.
“Although ulcerative colitis is not associated with an increased risk of mortality, it is associated with high morbidity and work disability, comparable to Crohn’s disease,” the reviewers concluded. Not only are contemporary population-level data lacking, but it also remains unclear whether treating patients with ulcerative colitis according to baseline risk affects the disease course, or whether the natural history of this disease differs in newly industrialized nations or the Asia-Oceania region, they added.
Dr. Fumery disclosed support from the French Society of Gastroenterology, AbbVie, MSD, Takeda, and Ferring. Coinvestigators disclosed ties to numerous pharmaceutical companies.
SOURCE: Fumery M et al. Clin Gastroenterol Hepatol. 2017 Jun 16. doi: 10.1016/j.cgh.2017.06.016.
Understanding the natural history of ulcerative colitis (UC) is imperative, especially in view of emerging therapies that could have the potential to alter the natural course of disease. Dr. Fumery and his colleagues are to be congratulated for conducting a comprehensive review of different inception cohorts across the world and evaluating different facets of the disease. They found that the majority of patients had a mild-moderate disease course, which was most active at the time of diagnosis. Approximately half the patients required UC-related hospitalization at some time during the course of their disease. Similarly, 50% of patients received corticosteroids, and while almost all patients with UC were treated with mesalamine within 1 year of diagnosis, 30%-40% were not on mesalamine long term. They also identified consistent predictors of poor prognosis, including young age at diagnosis, extensive disease, early need for corticosteroids, and elevated biochemical markers.
These results are reassuring because they reinforce the previous observations that roughly half the patients with UC have an uncomplicated disease course and that the first few years of disease are the most aggressive. A good indicator was that the proportion of patients receiving corticosteroids decreased over time. The disheartening news was that the long-term colectomy rates have generally remained stable over time.
Nabeel Khan, MD, is assistant professor of clinical medicine, University of Pennsylvania, Philadelphia, and director of gastroenterology, Philadelphia Veterans Affairs Medical Center. He has received research grants from Takeda, Luitpold, and Pfizer.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: Although usually mild to moderate in severity, ulcerative colitis is disabling over time.
Major finding: Cumulative risk of relapse was 70%-80% at 10 years.
Data source: A systematic review and analysis of 17 population-based cohorts.
Disclosures: Dr. Fumery disclosed support from the French Society of Gastroenterology, AbbVie, MSD, Takeda, and Ferring. Coinvestigators disclosed ties to numerous pharmaceutical companies.
Source: Fumery M et al. Clin Gastroenterol Hepatol. 2017 Jun 16. doi: 10.1016/j.cgh.2017.06.016.
VIDEO: No short-term link found between PPIs, myocardial infarction
Proton pump inhibitors (PPIs) were not associated with a short-term increase in the risk of myocardial infarction (MI) in a large retrospective insurance claims study.
Over a median follow-up of 2-3 months, estimated weighted risks of first-ever MI were low and similar regardless of whether patients started PPIs or histamine2 receptor antagonists (H2RAs), reported Suzanne N. Landi of the University of North Carolina at Chapel Hill, and her associates. “Contrary to prior literature, our analyses do not indicate increased risk of MI in PPI initiators compared to histamine2-receptor antagonist initiators,” they wrote in the March issue of Gastroenterology.
Epidemiologic studies have produced mixed findings on PPI use and MI risk. Animal models and ex vivo studies of human tissue indicate that PPIs might harm coronary vessels by increasing plasma levels of asymmetrical dimethylarginine, which counteracts the vasoprotective activity of endothelial nitric oxide synthase, the investigators noted. To further assess PPIs and risk of MI while minimizing potential confounding, they studied new users of either prescription PPIs or an active comparator, prescription H2RAs. The dataset included administrative claims for more than 5 million patients with no MI history who were enrolled in commercial insurance plans or Medicare Supplemental Insurance plans. The study data spanned from 2001 to 2014, and patients were followed from their initial antacid prescription until they either developed a first-ever MI, stopped their medication, or left their insurance plan. Median follow-up times were 60 days in patients with commercial insurance and 96 days in patients with Medicare Supplemental Insurance, which employers provide for individuals who are at least 65 years old.
After controlling for numerous measurable clinical and demographic confounders, the estimated 12-month risk of MI was about 2 cases per 1,000 commercially insured patients and about 8 cases per 1,000 Medicare Supplemental Insurance enrollees. The estimated 12-month risk of MI did not significantly differ between users of PPIs and H2RAs, regardless of whether they were enrolled in commercial insurance plans (weighted risk difference per 1,000 users, –0.08; 95% confidence interval, –0.51 to 0.36) or Medicare Supplemental Insurance (weighted risk difference per 1,000 users, –0.45; 95% CI, –1.53 to 0.58) plans.
Each antacid class also conferred a similar estimated risk of MI at 36 months, with weighted risk differences of 0.44 (95% CI, –0.90 to 1.63) per 1,000 commercial plan enrollees and –0.33 (95% CI, –4.40 to 3.46) per 1,000 Medicare Supplemental Insurance plan enrollees, the researchers reported. Weighted estimated risk ratios also were similar between drug classes, ranging from 0.87 (95% CI, 0.76 to 0.99) at 3 months among Medicare Supplemental Insurance enrollees to 1.08 (95% CI, 0.87 to 1.35) at 36 months among commercial insurance plan members.
“Previous studies have examined the risk of MI in PPI users and compared directly to nonusers, which may have resulted in stronger confounding by indication and other risk factors, such as BMI [body mass index] and baseline cardiovascular disease,” the investigators wrote. “Physicians and patients should not avoid starting a PPI because of concerns related to MI risk.”
The researchers received no grant support for this study. Ms. Landi disclosed a student fellowship from UCB Biosciences.
SOURCE: Source: Landi SN et al. Gastroenterology. 2017 Nov 6. doi: 10.1053/j.gastro.2017.10.042.
In the late 2000s, several large epidemiologic studies suggested that proton pump inhibitors (PPIs) increase the risk for MI in users of clopidogrel. There was a proposed mechanism: PPIs competitively inhibit cytochrome P450 isoenzymes, blocking clopidogrel activation and, ex vivo, increasing platelet aggregation. It sounded scary – but fortunately, some reassuring data quickly emerged. In 2007, the COGENT trial randomized patients with cardiovascular disease to a PPI/clopidogrel versus a placebo/clopidogrel combination pill. After 3 years of follow-up, there was no difference in rates of death or cardiovascular events. In the glaring light of these randomized controlled trial data, the earlier studies didn’t look so convincing.
So why won’t the PPI/MI issue die? In part because COGENT was a relatively small study. It included 3,761 patients, but the main result depended on 109 cardiovascular events. Naysayers have argued that perhaps if COGENT had been a bigger study, the result would have been different.
In this context, the epidemiologic study by Suzanne Landi and her associates provides further reassurance that PPIs do not cause MI. Two insurance cohorts comprising over 5 million patients were used to compare PPI users with histamine2-receptor antagonist users after adjusting for baseline differences between the two groups. The large size of the dataset allowed the authors to make precise estimates; we can say with confidence that there was no clinically relevant PPI/MI risk in these data.
Can we forget about PPIs and MI? These days, my patients worry more about dementia or chronic kidney disease. But the PPI/MI story is worth remembering. Large epidemiologic studies are sometimes contradicted by subsequent studies and need to be evaluated in context.
Daniel E. Freedberg, MD, MS, is an assistant professor of medicine at the Columbia University Medical Center, New York. He has consulted for Pfizer.
PPIs did not appear to increase the risk of MI in a large retrospective insurance claims study.
Over a median follow-up of 2-3 months, estimated weighted risks of first-ever MI were low and similar regardless of whether patients started PPIs or histamine2 receptor antagonists (H2RAs), reported Suzanne N. Landi of the University of North Carolina at Chapel Hill, and her associates. “Contrary to prior literature, our analyses do not indicate increased risk of MI in PPI initiators compared to histamine2-receptor antagonist initiators,” they wrote in the March issue of Gastroenterology.
Epidemiologic studies have produced mixed findings on PPI use and MI risk. Animal models and ex vivo studies of human tissue indicate that PPIs might harm coronary vessels by increasing plasma levels of asymmetrical dimethylarginine, which counteracts the vasoprotective activity of endothelial nitric oxide synthase, the investigators noted. To further assess PPIs and risk of MI while minimizing potential confounding, they studied new users of either prescription PPIs or an active comparator, prescription H2RAs. The dataset included administrative claims for more than 5 million patients with no MI history who were enrolled in commercial insurance plans or Medicare Supplemental Insurance plans. The study data spanned from 2001 to 2014, and patients were followed from their initial antacid prescription until they either developed a first-ever MI, stopped their medication, or left their insurance plan. Median follow-up times were 60 days in patients with commercial insurance and 96 days in patients with Medicare Supplemental Insurance, which employers provide for individuals who are at least 65 years old.
After controlling for numerous measurable clinical and demographic confounders, the estimated 12-month risk of MI was about 2 cases per 1,000 commercially insured patients and about 8 cases per 1,000 Medicare Supplemental Insurance enrollees. The estimated 12-month risk of MI did not significantly differ between users of PPIs and H2RAs, regardless of whether they were enrolled in commercial insurance plans (weighted risk difference per 1,000 users, –0.08; 95% confidence interval, –0.51 to 0.36) or Medicare Supplemental Insurance (weighted risk difference per 1,000 users, –0.45; 95% CI, –1.53 to 0.58) plans.
Each antacid class also conferred a similar estimated risk of MI at 36 months, with weighted risk differences of 0.44 (95% CI, –0.90 to 1.63) per 1,000 commercial plan enrollees and –0.33 (95% CI, –4.40 to 3.46) per 1,000 Medicare Supplemental Insurance plan enrollees, the researchers reported. Weighted estimated risk ratios also were similar between drug classes, ranging from 0.87 (95% CI, 0.76 to 0.99) at 3 months among Medicare Supplemental Insurance enrollees to 1.08 (95% CI, 0.87 to 1.35) at 36 months among commercial insurance plan members.
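For readers unfamiliar with risk differences expressed per 1,000 patients, the sketch below shows how such an estimate and its Wald 95% confidence interval can be computed from raw counts. The counts are hypothetical, chosen only to mimic the magnitude of the commercial-plan result; the study itself used weighted, confounder-adjusted estimates, which this simple unadjusted calculation does not reproduce.

```python
import math

def risk_difference_per_1000(events_a, n_a, events_b, n_b, z=1.96):
    """Unadjusted risk difference per 1,000 patients with a
    normal-approximation (Wald) 95% confidence interval."""
    p_a, p_b = events_a / n_a, events_b / n_b
    rd = p_a - p_b
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    lo, hi = rd - z * se, rd + z * se
    return rd * 1000, lo * 1000, hi * 1000

# Hypothetical counts (~2 MI cases per 1,000 at 12 months, as reported)
rd, lo, hi = risk_difference_per_1000(200, 100_000, 210, 100_000)
print(f"RD per 1,000: {rd:.2f} (95% CI, {lo:.2f} to {hi:.2f})")
# → RD per 1,000: -0.10 (95% CI, -0.50 to 0.30)
```

A confidence interval that straddles zero, as here and in both study cohorts, is what "did not significantly differ" refers to.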
“Previous studies have examined the risk of MI in PPI users and compared directly to nonusers, which may have resulted in stronger confounding by indication and other risk factors, such as BMI [body mass index] and baseline cardiovascular disease,” the investigators wrote. “Physicians and patients should not avoid starting a PPI because of concerns related to MI risk.”
The researchers received no grant support for this study. Ms. Landi disclosed a student fellowship from UCB Biosciences.
SOURCE: Landi SN et al. Gastroenterology. 2017 Nov 6. doi: 10.1053/j.gastro.2017.10.042.
FROM GASTROENTEROLOGY
Key clinical point: Starting a PPI did not appear to increase the short-term risk of MI.
Major finding: Over a median follow-up time of 2-3 months, the estimated risk of first-ever MI did not statistically differ between initiators of PPIs and initiators of histamine2-receptor antagonists.
Data source: Analyses of commercial and Medicare Supplemental Insurance claims for more than 5 million patients from 2001-2014.
Disclosures: The researchers received no grant support for this study. Ms. Landi disclosed a student fellowship from UCB Biosciences.
Source: Landi SN et al. Gastroenterology. 2017 Nov 6. doi: 10.1053/j.gastro.2017.10.042.
AGA Guideline: Use goal-directed fluid therapy, early oral feeding in acute pancreatitis
Patients with acute pancreatitis should receive “goal-directed” fluid therapy with normal saline or Ringer’s lactate solution rather than hydroxyethyl starch (HES) fluids, states a new guideline from the AGA Institute.
In a single-center randomized trial, hydroxyethyl starch fluids conferred a 3.9-fold increase in the odds of multiorgan failure (95% confidence interval for odds ratio, 1.2-12.0) compared with normal saline in patients with acute pancreatitis, wrote guideline authors Seth D. Crockett, MD, MPH, of the University of North Carolina, Chapel Hill, and his associates. This trial and another randomized study found no mortality benefit for HES over standard resuscitation fluids. The evidence is “very low quality” but mirrors the critical care literature, according to the experts. So far, Ringer’s lactate solution and normal saline have shown similar effects on the risk of organ failure, necrosis, and mortality, but ongoing trials should better clarify this choice, they noted (Gastroenterology. doi: 10.1053/j.gastro.2018.01.032).
The guideline addresses the initial 2-week period of treating acute pancreatitis. It defines goal-directed fluid therapy as titration based on meaningful targets, such as heart rate, mean arterial pressure, central venous pressure, urine output, blood urea nitrogen concentration, and hematocrit. Studies of goal-directed fluid therapy in acute pancreatitis have been unblinded, have used inconsistent outcome measures, and have found no definite benefits over nontargeted fluid therapy, note the guideline authors. Nevertheless, they conditionally recommend goal-directed fluid therapy, partly because in a randomized, blinded trial of patients with severe sepsis or septic shock (conditions that physiologically resemble acute pancreatitis), in-hospital mortality was 31% with goal-directed fluid therapy versus 47% with standard fluid therapy (P = .0009).
The guideline recommends against routine use of two interventions: prophylactic antibiotics and urgent endoscopic retrograde cholangiopancreatography (ERCP) for patients with acute pancreatitis. The authors note that no evidence supports routine prophylactic antibiotics for acute pancreatitis patients without cholangitis, and that urgent ERCP did not significantly affect the risk of mortality, multiorgan failure, single-organ failure, infected pancreatic and peripancreatic necrosis, or necrotizing pancreatitis in eight randomized controlled trials of patients with acute gallstone pancreatitis.
The guideline strongly recommends early oral feeding and enteral rather than parenteral nutrition for all patients with acute pancreatitis. In 11 randomized controlled trials, early and delayed feeding led to similar rates of mortality, but delayed feeding produced a 2.5-fold higher risk of necrosis (95% CI for OR, 1.4-4.4) and tended to increase the risk of infected peripancreatic necrosis, multiorgan failure, and total necrotizing pancreatitis, the authors wrote. In another 12 trials, enteral nutrition significantly reduced the risk of infected peripancreatic necrosis, single-organ failure, and multiorgan failure compared with parenteral nutrition.
Clinicians continue to debate cholecystectomy timing in patients with biliary or gallstone pancreatitis. The guideline strongly recommends same-admission cholecystectomy, citing a randomized controlled trial in which this approach markedly reduced the combined risk of mortality and gallstone-related complications (OR, 0.2; 95% CI, 0.1-0.6), readmission for recurrent pancreatitis (OR, 0.3; 95% CI, 0.1-0.9), and pancreaticobiliary complications (OR, 0.2; 95% CI, 0.1-0.6). “The AGA issued a strong recommendation due to the quality of available evidence and the high likelihood of benefit from early versus delayed cholecystectomy in this patient population,” the experts stated.
Patients with biliary pancreatitis should be evaluated for cholecystectomy during the same admission, while those with alcohol-induced pancreatitis should receive a brief alcohol intervention, according to the guidelines, which also call for better studies of how alcohol and tobacco cessation measures affect risk of recurrent acute pancreatitis, chronic pancreatitis, and pancreatic cancer, as well as quality of life, health care utilization, and mortality.
The authors also noted knowledge gaps concerning the relative benefits of risk stratification tools, the use of prophylactic antibiotics in patients with severe acute pancreatitis or necrotizing pancreatitis, and the timing of ERCP in patients with severe biliary pancreatitis with persistent biliary obstruction.
The guideline was developed with sole funding by the AGA Institute with no external funding. The authors disclosed no relevant conflicts of interest.
Source: Crockett SD et al. Gastroenterology. doi: 10.1053/j.gastro.2018.01.032.
FROM GASTROENTEROLOGY
Obesity affects the ability to diagnose liver fibrosis
Magnetic resonance elastography (MRE) and transient elastography (TE) yielded discordant fibrosis findings in 43.7% of patients, and body mass index was a significant predictor of that discordance, according to a study from the University of California, San Diego.
“This study demonstrates that BMI is a significant factor of discordancy between MRE and TE for the stage of significant fibrosis (2-4 vs. 0-1),” wrote Cyrielle Caussy, MD, and her colleagues (Clin Gastroenterol Hepatol. 2018 Jan 15. doi: 10.1016/j.cgh.2017.10.037). “Furthermore, this study showed that the grade of obesity is also a significant predictor of discordancy between MRE and TE because the discordance rate between MRE and TE increases with the increase in BMI.”
Dr. Caussy of the University of California, San Diego, and her colleagues had noted that MRE and TE had discordant findings in obese patients. To ascertain under what conditions TE and MRE produce the same readings, Dr. Caussy and her associates conducted a cross-sectional study of two cohorts with nonalcoholic fatty liver disease (NAFLD) who underwent contemporaneous MRE, TE, and liver biopsy. TE utilized both M and XL probes during imaging. The training cohort involved 119 adult patients undergoing NAFLD testing from October 2011 through January 2017. The validation cohort, consisting of 75 adults with NAFLD undergoing liver imaging from March 2010 through May 2013, was formed to validate the findings of the training cohort.
The study revealed that BMI was a significant predictor of discordance between MRE and TE in staging significant fibrosis (stage 2-4 vs. 0-1). After adjustment for age and sex, each 5-unit increase in BMI raised the odds of discordance (odds ratio, 1.694; 95% confidence interval, 1.145-2.507; P = .008). This was not a static relationship: as BMI increased, so did the discordance between MRE and TE (P = .0309). Notably, the discordance rate was significantly higher in participants with BMIs greater than 35 kg/m2 than in participants with BMIs below 35 (63.0% vs. 38.0%; P = .022), the investigators reported.
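To make the per-5-unit odds ratio concrete, the sketch below converts the reported odds ratio of 1.694 per 5-unit BMI increase into the underlying logistic-regression coefficient and the implied odds ratio over a 10-unit increase. The per-unit coefficient is back-calculated for illustration and does not appear in the paper.

```python
import math

# Reported: odds ratio 1.694 per 5-unit increase in BMI.
or_per_5 = 1.694
beta_per_unit = math.log(or_per_5) / 5      # log-odds per 1 BMI unit
or_per_10 = math.exp(beta_per_unit * 10)    # implied OR per 10 BMI units

print(f"beta per BMI unit: {beta_per_unit:.4f}")
print(f"implied OR per 10 BMI units: {or_per_10:.3f}")  # 1.694**2 ≈ 2.870
```

Because log-odds are additive on the BMI scale, the odds ratio compounds multiplicatively: a patient 10 BMI units heavier has roughly 1.694² times the odds of discordant MRE/TE readings under this model.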
While the study revealed valuable information, it had both strengths and limitations. A strength of the study was the use of two cohorts, specifically the validation cohort. The use of the liver biopsy as a reference, which is the standard for assessing fibrosis, was also a strength of the study. A limitation was that the study was conducted at specialized, tertiary care centers using advanced imaging techniques that may not be available at other clinics. Additionally, the cohorts included a small number of patients with advanced fibrosis.
“The integration of the BMI in the screening strategy for the noninvasive detection of liver fibrosis in NAFLD should be considered, and this parameter would help to determine when MRE is not needed in future guidelines,” wrote Dr. Caussy and her associates. “Further cost-effectiveness studies are necessary to evaluate the clinical utility of MRE, TE, and/or liver biopsy to develop optimal screening strategies for diagnosing NAFLD-associated fibrosis.”
Jun Chen, MD, Meng Yin, MD, and Richard L. Ehman, MD, all have intellectual property rights and financial interests in elastography technology. Dr. Ehman also serves as a noncompensated CEO of Resoundant. Claude B. Sirlin, MD, has served as a consultant to Bayer and GE Healthcare. The remaining authors disclosed no conflicts.
The AGA Obesity Practice Guide provides a comprehensive, multi-disciplinary process to personalize innovative obesity care for safe and effective weight management. Learn more at www.gastro.org/obesity.
SOURCE: Caussy C et al. Clin Gastrolenterol Hepatol. 2018 Jan 15. doi: 10.1016/j.cgh.2017.10.037.
Body mass index accounts for a 43.7% discordance in fibrosis findings between magnetic resonance elastography (MRE) and transient elastography (TE), according to a study from the University of California, San Diego.
“This study demonstrates that BMI is a significant factor of discordancy between MRE and TE for the stage of significant fibrosis (2-4 vs. 0-1),” wrote Cyrielle Caussy, MD, and her colleagues (Clin Gastroenterol Hepatol. 2018 Jan 15. doi: 10.1016/j.cgh.2017.10.037). “Furthermore, this study showed that the grade of obesity is also a significant predictor of discordancy between MRE and TE because the discordance rate between MRE and TE increases with the increase in BMI.”
Dr. Caussy of the University of California, San Diego, and her colleagues had noted discordant findings between MRE and TE in obese patients. To ascertain under what conditions the two modalities produce the same readings, they conducted a cross-sectional study of two cohorts of patients with nonalcoholic fatty liver disease (NAFLD) who underwent contemporaneous MRE, TE, and liver biopsy. TE utilized both M and XL probes during imaging. The training cohort comprised 119 adult patients undergoing NAFLD testing from October 2011 through January 2017. The validation cohort, consisting of 75 adults with NAFLD undergoing liver imaging from March 2010 through May 2013, was formed to validate the findings of the training cohort.
The study revealed that BMI was a significant predictor of discordance between MRE and TE in assessing the stage of liver fibrosis (2-4 vs. 0-1). After adjustment for age and sex, each 5-unit increase in BMI was associated with greater odds of discordance (odds ratio, 1.694; 95% confidence interval, 1.145-2.507; P = .008). The relationship was not static: as BMI increased, so did the discordance between MRE and TE (P = .0309). Notably, the discordance rate was significantly higher in participants with BMIs greater than 35 kg/m2 than in participants with BMIs below 35 kg/m2 (63.0% vs. 38.0%; P = .022), the investigators reported.
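To illustrate how the reported per-5-unit odds ratio compounds over larger BMI differences, here is a minimal sketch; the function name is ours, and it assumes the log-linear form of the logistic model behind the reported per-5-unit estimate:

```python
# Adjusted odds ratio for MRE/TE discordance per 5-unit BMI increase,
# as reported in the article (95% CI, 1.145-2.507)
OR_PER_5_UNITS = 1.694

def discordance_odds_ratio(bmi_delta):
    """Odds ratio implied for an arbitrary BMI difference, assuming the
    log-linear (logistic) model behind the reported per-5-unit estimate."""
    return OR_PER_5_UNITS ** (bmi_delta / 5)

# Under this model, a 10-unit BMI difference (e.g., BMI 25 vs. 35)
# roughly triples the odds of discordance between MRE and TE.
print(round(discordance_odds_ratio(10), 2))  # 2.87
```

This multiplicative scaling is consistent with the authors' observation that discordance grows with increasing BMI rather than jumping at a single threshold.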
The study's strengths included the use of two cohorts, particularly the validation cohort, and the use of liver biopsy, the reference standard for assessing fibrosis, as the comparator. Its limitations included being conducted at specialized tertiary care centers using advanced imaging techniques that may not be available at other clinics, and the small number of patients with advanced fibrosis in both cohorts.
“The integration of the BMI in the screening strategy for the noninvasive detection of liver fibrosis in NAFLD should be considered, and this parameter would help to determine when MRE is not needed in future guidelines,” wrote Dr. Caussy and her associates. “Further cost-effectiveness studies are necessary to evaluate the clinical utility of MRE, TE, and/or liver biopsy to develop optimal screening strategies for diagnosing NAFLD-associated fibrosis.”
Jun Chen, MD, Meng Yin, MD, and Richard L. Ehman, MD, all have intellectual property rights and financial interests in elastography technology. Dr. Ehman also serves as a noncompensated CEO of Resoundant. Claude B. Sirlin, MD, has served as a consultant to Bayer and GE Healthcare. All other authors did not disclose any conflicts.
The AGA Obesity Practice Guide provides a comprehensive, multi-disciplinary process to personalize innovative obesity care for safe and effective weight management. Learn more at www.gastro.org/obesity.
SOURCE: Caussy C et al. Clin Gastroenterol Hepatol. 2018 Jan 15. doi: 10.1016/j.cgh.2017.10.037.
VIDEO: Cystic fibrosis patients need earlier, more frequent colorectal cancer screening
Adults with cystic fibrosis (CF) should undergo screening colonoscopy for colorectal cancer every 5 years beginning at age 40 years, unless they have had a solid organ transplant – in which case, screening should begin at age 30 years. For both groups, screening intervals should be shortened to 3 years if any adenomatous polyps are recovered.
The new screening recommendation is 1 of 10 set forth by the Cystic Fibrosis Foundation, in conjunction with the American Gastroenterological Association. The document reflects the significantly increased risk of colorectal cancer among adults with the chronic lung disorder, Denis Hadjiliadis, MD, and his colleagues wrote in the February issue of Gastroenterology; the risk approaches a 30-fold increase among CF patients who have undergone a lung transplant.
SOURCE: American Gastroenterological Association
In addition to making recommendations on screening intervals and protocols, the document asks clinicians to stop thinking of CF as a respiratory-only disease.
“Physicians should recognize that CF is a colon cancer syndrome,” wrote Dr. Hadjiliadis, director of the Adult Cystic Fibrosis Program at the University of Pennsylvania, Philadelphia, and his coauthors.
The increased colorectal cancer risk has become increasingly evident as CF patients live longer, Dr. Hadjiliadis and the panel wrote.
“The current median predicted survival is 41 years, and persons born in 2015 have an estimated average life expectancy of 45 years. The increasing longevity of adults with CF puts them at risk for other diseases, such as gastrointestinal cancer.”
In addition to the normal age-related risk, however, CF patients seem to have an elevated risk profile unique to the disease. The underlying causes have not been fully elucidated but may have to do with mutations in the cystic fibrosis transmembrane conductance regulator (CFTR), which are responsible for the excess thickened mucosal secretions that characterize CF. CFTR also is a tumor-suppressor gene in the intestinal tract of mice, and is important in gastrointestinal epithelial homeostasis. “Absence of CFTR is associated with dysregulation of the immune response, intestinal stem cells, and growth signaling regulators,” the authors noted.
In response to this observed increased risk of colorectal cancers among CF patients, the Cystic Fibrosis Foundation convened an 18-member task force to review the extant literature and compile colorectal cancer screening recommendations for CF patients who show no signs of such malignancies. The team reviewed 1,159 articles and based its findings on the 50 most relevant. The papers comprised observational studies, case-control studies, and case reports; there are no randomized clinical trials of screening for this population.
The American Gastroenterological Association reviewed and approved all of the recommendations:
- Screening decisions should be a collaborative process between the CF patient and clinician, taking into account comorbidities, safety, and quality of life. This should include a discussion of expected lifespan; patients with limited lifespan won’t benefit from screening for a slow-growing cancer. Patients should also consider that the colonoscopy prep for CF patients is somewhat more complex than for non-CF patients. “Given these complexities, the task force agreed that individuals with CF and their providers should … carefully assess the risks and benefits of CRC screening and its impact on the health and quality of life for the adult with CF.”
- The decision team should include an endoscopist. An endoscopist with CF training is preferred, but the panel noted these specialists are rare.
- Colonoscopy is the preferred method of screening for CF patients, since it can both detect and remove polyps. “This is one of the main reasons why colonoscopy is the screening procedure of choice for other high-risk groups,” the panel noted.
- There is insufficient evidence to recommend alternate screening methods in CF patients, including CT colonography, stool-based tests, or flexible sigmoidoscopy.
- In CF patients without signs of CRC, screening should commence at age 40 years and be repeated every 5 years as long as the results are negative.
- Any CF patient who has had adenomatous polyps on a screening colonoscopy should have a repeat colonoscopy within 3 years, unless clinical findings support more frequent screening.
- For any adult CF patient older than age 30 years who has undergone a solid organ transplant, screening colonoscopy should commence within 2 years of transplantation. “Although the absolute risk of CRC in individuals with CF is extremely low for patients younger than 30 years, the risk … greatly increases after lung transplantation,” to 25-30 times the age-adjusted baseline, the panel wrote. “Increased posttransplantation survival means that many transplant patients will enter older age groups where there is an increased risk of cancer.” Screening should be performed after recovery and within 2 years, unless there was a negative colonoscopy in the 5 years before transplant.
- Thereafter, patients who have had a solid organ transplant should undergo colonoscopy every 5 years, based on their life expectancy. “In cases where the expected survival time is limited (less than 10 years), screening should not be performed. For adults appropriately selected, lung transplantation usually increases survival probability. Therefore, a lung transplantation candidate with a short life expectancy is likely to become a screening candidate before and after transplantation at the appropriate ages described here, because the potential survival increases to approximately 10 years.”
- Colonoscopy should be repeated every 3 years on CF patients with transplants with a history of adenomatous polyps. This interval may be as short as 1 year for patients with high-risk, large, or multiple polyps.
- CF patients should undergo more intense bowel prep for colonoscopy, with three to four washes of a minimum of 1 liter of purgative per wash; the last wash should occur 4-6 hours before the procedure. Split-prep regimens (several smaller-volume washes) are better than a single larger-volume wash. The panel suggested a sample CF-specific regimen available from the Minnesota Cystic Fibrosis Center.
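The interval logic in the recommendations above can be summarized as a decision sketch; the function and parameter names are illustrative rather than taken from the guideline, and it covers only the cases the document spells out:

```python
def next_colonoscopy_interval_years(age, had_transplant, had_adenomatous_polyps,
                                    high_risk_polyps=False):
    """Sketch of the CF colorectal cancer screening intervals described in
    the recommendations. Returns an interval in years, or None when routine
    screening has not yet begun. Names are illustrative, not from the guideline."""
    # Screening starts at 40, or at 30 after a solid organ transplant
    start_age = 30 if had_transplant else 40
    if age < start_age:
        return None  # routine screening not yet recommended
    if had_adenomatous_polyps:
        # High-risk, large, or multiple polyps may shorten this to 1 year
        return 1 if high_risk_polyps else 3
    return 5  # rescreen every 5 years while results remain negative

print(next_colonoscopy_interval_years(45, False, False))  # 5
print(next_colonoscopy_interval_years(42, False, True))   # 3
print(next_colonoscopy_interval_years(35, True, True, high_risk_polyps=True))  # 1
```

The sketch deliberately omits the individualized judgments the task force emphasizes, such as life expectancy, comorbidities, and the timing of screening relative to transplantation.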
The new document reflects expert consensus on the currently available data, the panel said. As more data emerge, the recommendations might change.
“It is possible that different subpopulations will need more or less frequent schedules for rescreening and surveillance. Our recommendations are making an effort to balance the risk of missing advanced colorectal cancer and minimizing the burden and risk of too frequent examinations.”
None of the panel members had any financial disclosures.
SOURCE: Hadjiliadis D et al. Gastroenterology. 2017 Dec 28. doi: 10.1053/j.gastro.2017.12.012.
According to the Cystic Fibrosis Foundation Patient Registry, more than 30,000 people are living with cystic fibrosis (CF) in the United States, and more than half of them are over 18 years of age. It is extremely important to talk to patients about preventive medicine, a topic that CF health care providers were not routinely adding to their management plans in the past.
FROM GASTROENTEROLOGY
VIDEO: Gluten-free diet tied to heavy metal bioaccumulation
A gluten-free diet was associated with significantly increased blood levels of mercury, lead, and cadmium and with significantly increased urinary levels of arsenic in a large cross-sectional population-based survey study.
Source: American Gastroenterological Association
After researchers controlled for demographic characteristics, “levels of all heavy metals remained significantly higher in persons following a gluten-free diet, compared with those not following a gluten-free diet,” Stephanie L. Raehsler, MPH, of Mayo Clinic in Rochester, Minn., wrote with her associates in an article published in the February issue of Clinical Gastroenterology and Hepatology.
The purported (unproven) benefits of a gluten-free diet (GFD) have propelled it into the mainstream outside the settings of celiac disease, dermatitis herpetiformis, and wheat allergy. However, GFDs have been linked to nutritional deficits of iron, ferritin, zinc, and fiber; to increased consumption of sugar, fats, and salt; and to excessive bioaccumulation of mercury, the investigators noted.
High intake of rice, a staple of many GFDs, also has been associated with elevated urinary excretion of arsenic (PLoS One. 2014 Sep 8;9[9]:e104768. doi: 10.1371/journal.pone.0104768). To further characterize these relationships, the researchers analyzed data for 2009 through 2012 from 11,354 participants in the National Health and Nutrition Examination Survey (NHANES). Blood levels of lead, mercury, and cadmium were available from 115 participants who reported following a GFD, and data on urinary arsenic levels were available from 32 such individuals.
In the overall study group, blood mercury levels averaged 1.37 mcg/L (95% confidence interval, 1.02-1.85 mcg/L) among persons on a GFD and 0.93 mcg/L (95% CI, 0.86-1.0 mcg/L) in persons not on a GFD (P = .008). Individuals on a GFD also had significantly higher total blood levels of lead (1.42 vs. 1.13 mcg/L; P = .007) and cadmium (0.42 vs. 0.34 mcg/L; P = .03), and they had significantly higher urinary levels of total arsenic (15.2 vs. 8.4 mcg/L; P = .003). These significant differences persisted after researchers controlled for age, sex, race, and smoking status.
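For a sense of the relative elevations, the reported group means imply roughly the following percentage differences between the GFD and non-GFD groups (a simple illustration using the figures above; these are crude ratios of means, not the study's adjusted estimates):

```python
# Mean levels reported in the article (GFD vs. no GFD)
levels = {
    "blood mercury (mcg/L)":   (1.37, 0.93),
    "blood lead (mcg/L)":      (1.42, 1.13),
    "blood cadmium (mcg/L)":   (0.42, 0.34),
    "urinary arsenic (mcg/L)": (15.2, 8.4),
}

for metal, (gfd, no_gfd) in levels.items():
    pct_higher = (gfd / no_gfd - 1) * 100
    print(f"{metal}: {pct_higher:.0f}% higher on a GFD")
```

By this rough yardstick, urinary arsenic shows the largest relative elevation (about 81% higher), consistent with the authors' focus on rice, a GFD staple, as an arsenic source.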
Additionally, among 101 individuals on GFDs who had no laboratory or clinical indication of celiac disease, blood levels of total mercury were significantly elevated, compared with individuals not on a GFD (1.40 vs. 0.93 mcg/L; P = .02), as were blood lead concentrations (1.44 vs. 1.13 mcg/L; P = .01) and urinary arsenic levels (14.7 vs. 8.3 mcg/L; P = .01). Blood cadmium levels also were increased (0.42 vs. 0.34 mcg/L), but this difference did not reach statistical significance (P = .06).
Individuals who reported eating fish or shellfish in the past month had higher blood mercury levels than those who did not, regardless of whether they were on a GFD. However, only two individuals in the study exceeded the toxicity threshold for mercury and neither was on a GFD, the researchers said. For most individuals on a GFD, levels of all heavy metals except urinary arsenic stayed under the recognized limits for toxicity, they noted.
The number of respondents following a GFD was small, but the investigators followed NHANES recommendations on sampling weights and sample design variables. Also, although the NHANES included only one question on GFDs, trained interviewers were used to help minimize bias. “Studies are needed to determine the long-term effects of accumulation of these elements in persons on a GFD,” the researchers concluded.
The Centers for Disease Control and Prevention provided partial funding. The researchers reported having no conflicts of interest.
SOURCE: Raehsler S et al. Clin Gastro Hepatol. 2018;(in press).
A gluten-free diet was associated with significantly increased blood levels of mercury, lead, and cadmium and with significantly increased urinary levels of arsenic in a large cross-sectional population-based survey study.
Source: American Gastroenterological Association
After researchers controlled for demographic characteristics, “levels of all heavy metals remained significantly higher in persons following a gluten-free diet, compared with those not following a gluten-free diet,” Stephanie L. Raehsler, MPH, of Mayo Clinic in Rochester, Minn., wrote with her associates in an article published in the February issue of Clinical Gastroenterology and Hepatology.
The purported (unproven) benefits of a gluten-free diet (GFD) have propelled it into the mainstream outside the settings of celiac disease, dermatitis herpetiformis, and wheat allergy. However, GFDs have been linked to nutritional deficits of iron, ferritin, zinc, and fiber, to increased consumption of sugar, fats, and salt, and to excessive bioaccumulation of mercury, the investigators noted.
High intake of rice, a staple of many GFDs, also has been associated with elevated urinary excretion of arsenic (PLoS One. 2014 Sep 8;9[9]:e104768. doi: 10.1371/journal.pone.0104768). To further characterize these relationships, the researchers analyzed data for 2009 through 2012 from 11,354 participants in the National Health and Nutrition Examination Survey (NHANES). Blood levels of lead, mercury, and cadmium were available from 115 participants who reported following a GFD, and data on urinary arsenic levels were available from 32 such individuals.
In the overall study group, blood mercury levels averaged 1.37 mcg/L (95% confidence interval, 1.02-1.85 mcg/L) among persons on a GFD and 0.93 mcg/L (95% CI, 0.86-1.0 mcg/L) in persons not on a GFD (P = .008). Individuals on a GFD also had significantly higher total blood levels of lead (1.42 vs. 1.13 mcg/L; P = .007) and cadmium (0.42 vs. 0.34 mcg/L; P = .03), and they had significantly higher urinary levels of total arsenic (15.2 vs. 8.4 mcg/L; P = .003). These significant differences persisted after researchers controlled for age, sex, race, and smoking status.
Additionally, among 101 individuals on GFDs who had no laboratory or clinical indication of celiac disease, blood levels of total mercury were significantly elevated, compared with individuals not on a GFD (1.40 vs. 0.93 mcg/L; P = .02), as were blood lead concentrations (1.44 vs. 1.13 mcg/L; P = .01) and urinary arsenic levels (14.7 vs. 8.3 mcg/L; P = .01). Blood cadmium levels also were increased (0.42 vs. 0.34 mcg/L), but this difference did not reach statistical significance (P = .06).
Individuals who reported eating fish or shellfish in the past month had higher blood mercury levels than those who did not, regardless of whether they were on a GFD. However, only two individuals in the study exceeded the toxicity threshold for mercury and neither was on a GFD, the researchers said. For most individuals on a GFD, levels of all heavy metals except urinary arsenic stayed under the recognized limits for toxicity, they noted.
The number of respondents following a GFD was small, but the investigators followed NHANES recommendations on sampling weights and sample design variables. Also, although the NHANES included only one question on GFDs, trained interviewers were used to help minimize bias. “Studies are needed to determine the long-term effects of accumulation of these elements in persons on a GFD,” the researchers concluded.
The Centers for Disease Control and Prevention provided partial funding. The researchers reported having no conflicts of interest.
SOURCE: Raehsler S et al. Clin Gastroenterol Hepatol. 2018 (in press).
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: A gluten-free diet was associated with significantly increased bioaccumulation of several heavy metals.
Major finding: After accounting for demographic factors, blood or urinary levels of lead, cadmium, arsenic, and mercury were significantly higher in persons following a gluten-free diet, compared with those who did not follow a gluten-free diet.
Data source: A population-based, cross-sectional study of 11,354 respondents to NHANES 2009-2012, including 115 persons on a gluten-free diet.
Disclosures: The Centers for Disease Control and Prevention provided partial funding. The researchers reported having no conflicts of interest.
Source: Raehsler S et al. Clin Gastroenterol Hepatol. 2018 (in press).
VIP an unwelcome contributor to eosinophilic esophagitis
Vasoactive intestinal peptide (VIP) appears to play an important role in the pathology of eosinophilic esophagitis (EoE) by recruiting mast cells and eosinophils that contribute to EoE’s hallmark symptoms of dysphagia and esophageal dysmotility, investigators reported in the February issue of Cellular and Molecular Gastroenterology and Hepatology.
Blocking one of three VIP receptors – chemoattractant receptor-homologous molecule expressed on Th2 (CRTH2) – could reduce eosinophil infiltration and mast cell numbers in the esophagus, wrote Alok K. Verma, PhD, a postdoctoral fellow at Tulane University in New Orleans, and his colleagues.
“We suggest that inhibiting the VIP–CRTH2 axis may ameliorate the dysphagia, stricture, and motility dysfunction of chronic EoE,” they wrote in a research letter to Cellular and Molecular Gastroenterology and Hepatology.
Several cytokines and chemokines, notably interleukin-5 and eotaxin-3, have been fingered as suspects in eosinophil infiltration, but whether chemokines other than eotaxin play a role has not been well documented, the investigators noted.
They hypothesized that VIP may be a chemoattractant that draws eosinophils into perineural areas of the muscular mucosa of the esophagus.
To test this idea, they examined VIP expression in samples from patients both with and without EoE and found that VIP expression was low among controls (without EoE); they also found that eosinophils accumulated near VIP-expressing nerve cells in biopsy samples from patients with EoE.
When they performed in vitro studies of VIP binding and immunologic functions, they found that eosinophils primarily express the CRTH2 receptor rather than the vasoactive intestinal peptide receptor 1 (VPAC-1) or VPAC-2.
They also demonstrated that VIP’s effect on eosinophil motility was similar to that of eotaxin and that, when they pretreated eosinophils with a CRTH2 inhibitor, eosinophil motility was hampered.
The investigators next looked at biopsy specimens from patients with EoE and found that eosinophils that express CRTH2 accumulated in the epithelial mucosa.
To see whether (as they and other researchers had suspected) VIP and its interaction with the CRTH2 receptor might play a role in mast cell recruitment, they performed immunofluorescence analyses and confirmed the presence of the CRTH2 receptor on tryptase-positive mast cells in the esophageal mucosa of patients with EoE.
“These findings suggest that, similar to eosinophils, mast cells accumulate via interaction of the CRTH2 receptor with neurally derived VIP,” they wrote.
Finally, to see whether reducing peak eosinophil levels with a CRTH2 antagonist – as seen in prior studies – could also ameliorate the negative effects of mast cells on esophageal function, they looked at the effects of CRTH2 inhibition in a mouse model of human EoE.
They found that, in the mice treated with a CRTH2 blocker, each segment of the esophagus had significant reductions in both eosinophil infiltration and mast cell numbers (P less than .05 for each).
The work was supported in part by grants from the National Institutes of Health and the Tulane Edward G. Schlieder Educational Foundation. Senior author Anil Mishra, PhD, disclosed serving as a consultant for Axcan Pharma, Aptalis, Elite Biosciences, Calypso Biotech SA, and Enumeral Biomedical. The remaining authors disclosed no conflicts of interest.
SOURCE: Verma AK et al. Cell Mol Gastroenterol Hepatol. 2018;5[1]:99-100.e7.
The rapid increase in the incidence of pediatric and adult eosinophilic esophagitis (EoE) draws immediate attention to the importance of studying the mechanisms underlying this detrimental condition. The lack of preventive or curative therapies for EoE further underscores the importance of research that addresses gaps in our understanding of how eosinophilic inflammation of the esophagus is regulated on the molecular and cellular level. EoE is classified as an allergic immune disorder of the gastrointestinal tract and is characterized by eosinophil-rich, chronic Th2-type inflammation of the esophagus.
In this recent publication, the laboratory of Anil Mishra, PhD, showed that vasoactive intestinal peptide (VIP) serves as a potent chemoattractant for eosinophils and promotes accumulation of these innate immune cells adjacent to nerve cells in the muscular mucosa. Increased VIP expression was documented in EoE patients when compared to controls, and the authors identified the chemoattractant receptor homologous molecule expressed on Th2 lymphocytes (CRTH2) as a main binding receptor for VIP. Interestingly, CRTH2 was not only found to be expressed on eosinophils but also on tissue mast cells – another innate immune cell type that significantly contributes to the inflammatory tissue infiltrate in EoE patients. Based on the human findings, the authors tested whether VIP plays a major role in recruiting eosinophils and mast cells to the inflamed esophagus and whether CRTH2 blockade can modulate experimental EoE. Indeed, EoE pathology improved in animals that were treated with a CRTH2 antagonist.
In conclusion, these observations suggest that inhibiting the VIP-CRTH2 axis may serve as a therapeutic intervention pathway to ameliorate innate tissue inflammation in EoE patients.
Edda Fiebiger, PhD, is in the department of pediatrics in the division of gastroenterology, hepatology and nutrition at Boston Children’s Hospital, as well as in the department of medicine at Harvard Medical School, also in Boston. She had no disclosures.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: VIP appears to play an important role in the pathogenesis of eosinophilic esophagitis (EoE).
Major finding: Neurally derived VIP and its interaction with the CRTH2 receptor appear to recruit eosinophils and mast cells into the esophageal mucosa.
Data source: In vitro studies of human EoE biopsy samples and in vivo studies in mouse models of EoE.
Disclosures: The work was supported in part by grants from the National Institutes of Health and the Tulane Edward G. Schlieder Educational Foundation. Senior author Anil Mishra, PhD, disclosed serving as a consultant for Axcan Pharma, Aptalis, Elite Biosciences, Calypso Biotech SA, and Enumeral Biomedical. The remaining authors disclosed no conflicts of interest.
Source: Verma AK et al. Cell Mol Gastroenterol Hepatol. 2018;5[1]:99-100.e7.