Ultraprocessed foods: Large study finds link with Crohn’s disease
Higher consumption of ultraprocessed foods was linked with a significantly higher risk of Crohn’s disease (CD), but not ulcerative colitis, in a large prospective cohort study published online in Clinical Gastroenterology and Hepatology.
Researchers, led by Chun-Han Lo, MD, of Massachusetts General Hospital, Boston, defined ultraprocessed foods “as ready-to-consume formulations of ingredients, typically created by [a] series [of] industrial techniques and processes. They frequently involve the incorporation of additives, such as sweeteners, preservatives, emulsifiers, thickeners, and flavors, which aid in food preservation and produce hyperpalatable products.”
The rising global incidence of inflammatory bowel disease (IBD) in regions undergoing Westernization has coincided with a marked increase in consumption of ultraprocessed food (UPF) over the past few decades, according to the authors. Previous studies have examined links between individual nutrients and IBD, but this study focused on the role of processing itself. The analysis comprised 245,112 participants (203,516 women and 41,596 men) and 5,468,444 person-years of follow-up, drawn from three cohorts: the Nurses’ Health Study, the Nurses’ Health Study II, and the Health Professionals Follow-up Study.
In the highest quartile, UPFs made up on average nearly half (46.4%) of participants’ total energy consumption, compared with 21% in the lowest quartile.
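For readers who want to see how such quartile assignments are typically constructed, here is a minimal sketch using pandas; the column names and values are hypothetical toy data, not the study's code:

```python
import pandas as pd

# Toy data: percentage of total energy from UPFs per participant.
df = pd.DataFrame({"pct_energy_upf": [12.0, 21.0, 33.5, 46.4, 50.2, 18.7, 40.1, 29.9]})

# Split participants into quartiles of UPF energy share (Q1 = lowest, Q4 = highest).
df["upf_quartile"] = pd.qcut(df["pct_energy_upf"], q=4, labels=["Q1", "Q2", "Q3", "Q4"])

# Mean UPF energy share within each quartile, analogous to the 21% vs. 46.4% figures.
print(df.groupby("upf_quartile", observed=True)["pct_energy_upf"].mean())
```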
The researchers found that compared with participants in the lowest quartile of simple updated UPF consumption, those in the highest quartile had a significantly increased risk of CD (adjusted hazard ratio, 1.70; 95% confidence interval, 1.23-2.35).
In addition, “a secondary analysis across different CD locations demonstrated that participants in the highest quartile of simple updated UPF intake had the highest risk of ileal, colonic, and ileocolonic CD,” the authors wrote.
Three groups of processed foods driving risk increase
Three groups of UPFs appeared to drive the increased risk of CD: ultraprocessed breads and breakfast foods; frozen or shelf-stable ready-to-eat/heat meals; and sauces, cheeses, spreads, and gravies.
Just as with overall consumption, researchers did not find an association between any of those three subgroups and UC risk.
The authors suggested several explanations for the link with Crohn’s disease. First, higher UPF consumption may mean those foods are displacing unprocessed or minimally processed foods, such as those rich in fiber. Second, UPFs contain additives, such as salt, that may promote intestinal inflammation. Third, artificial sweeteners in UPFs may predispose the gut to inflammation, as suggested by studies supplementing sucralose and maltodextrin in mouse models of spontaneous ileitis.
As for why CD, but not UC, the authors said diet may be more relevant and have a stronger effect biologically in CD compared with UC. Another potential reason, they said, is that results “may reflect the greater specificity of dietary ligands and metabolites on the small intestine compared with the colon.”
Data from three large, prospective cohorts
Researchers used data from three ongoing, prospective nationwide cohorts of health professionals in the United States – the Nurses’ Health Study (1986-2014); the Nurses’ Health Study II (1991-2017); and the Health Professionals Follow-up Study (1986-2012).
In all three cohorts, participants filled in questionnaires at enrollment and every 2 years thereafter with information such as medical history and lifestyle factors. Diet was assessed via validated semi-quantitative food frequency questionnaires.
The researchers used Cox proportional hazards models, adjusting for confounders, to estimate hazard ratios (HRs) and 95% confidence intervals for Crohn’s disease and ulcerative colitis according to participants’ self-reported consumption of ultraprocessed foods.
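As an illustration of how such a model is fit in practice, here is a minimal sketch using the Python lifelines library; the variable names and toy data are hypothetical, not the study's code, and a real analysis of this kind involves hundreds of thousands of observations:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy data: follow-up time in years, CD diagnosis indicator, an indicator for
# highest vs. lowest UPF quartile, and an example confounder (age at entry).
df = pd.DataFrame({
    "followup_years": [20.1, 18.5, 25.0, 22.3, 15.7, 24.8, 19.9, 21.4],
    "crohns_disease": [0, 1, 0, 0, 1, 0, 0, 1],
    "upf_q4_vs_q1":   [0, 1, 0, 1, 1, 0, 1, 0],
    "age_at_entry":   [42, 55, 38, 61, 49, 45, 58, 52],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="crohns_disease")

# The exp(coef) column holds adjusted hazard ratios; the summary also includes 95% CIs.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```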
Further studies could help determine which UPF components are driving the higher risk for Crohn’s disease and whether risk differs by length of exposure to UPFs.
“By avoiding UPF consumption, individuals might substantially lower their risk of developing CD in addition to gaining other health benefits,” the authors wrote.
A coauthor, James M. Richter, MD, is a consultant for Policy Analysis Inc and Takeda Pharmaceuticals. Andrew T. Chan, MD, serves as a consultant for Janssen Pharmaceuticals, Pfizer Inc, and Bayer Pharma AG. Ashwin N. Ananthakrishnan, MD, has served as a scientific advisory board member for AbbVie, Gilead, and Kyn Therapeutics, and received research grants from Pfizer and Merck. The remaining authors disclosed no conflicts. This work was supported by the National Institutes of Health, the Beker Foundation, the Chleck Family Foundation, and the Crohn’s and Colitis Foundation.
Because consumption of industrially manufactured foods has risen in parallel with the incidence of autoimmune diseases, modern diets are hypothesized to contribute to the development of inflammatory bowel disease. In this study, Lo and colleagues conducted a retrospective cohort study to determine if individuals who reported higher levels of ultraprocessed food intake had higher rates of developing IBD. In their adjusted analysis, the authors report that the rate of developing Crohn’s disease was 70% higher for individuals who consumed the highest quartile of ultraprocessed foods; there was no association seen with ulcerative colitis.
While we await clarification about which ingredients are responsible, we should continue to encourage our patients to incorporate whole foods into their diets for both gastrointestinal and cardiometabolic health. At the same time, we must remain empathetic to systemic barriers to accessing and preparing high-quality, minimally processed foods. As such, we should advocate for policies and programs that mitigate food deserts. If food policy remains status quo, this study illustrates a frightening possibility of how disparities in gastrointestinal health equity could worsen in the future.
Ravy K. Vajravelu, MD, MSCE, is an assistant professor of medicine in the division of gastroenterology, hepatology, and nutrition at the University of Pittsburgh Center for Health Equity Research and Promotion and the VA Pittsburgh Healthcare System. This commentary does not represent the views of the U.S. Department of Veterans Affairs or the United States Government. Dr. Vajravelu reports no relevant disclosures.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Locoregional therapy lowers wait-list dropout in HCC
The use of bridging locoregional therapy (LRT) before liver transplantation in patients with hepatocellular carcinoma (HCC) has significantly increased in the United States within the past 15 years, a recent analysis suggests. Data show that liver transplant candidates with HCC who have elevated tumor burden and patients with more compensated liver disease have received a greater number of treatments while awaiting transplant.
According to the researchers, led by Allison Kwong, MD, of Stanford (Calif.) University, liver transplant remains a curative option for individuals with unresectable HCC who meet prespecified size criteria. In the United States, a mandated 6-month waiting period before gaining exception points has been implemented “to allow for consideration of tumor biology and reduce the disparities in wait-list dropout between HCC and non-HCC patients,” the researchers wrote.
Several forms of LRT are now available for HCC, including chemoembolization, external beam radiation, radioembolization, and radiofrequency or microwave ablation. In the liver transplant setting, these LRT options enable management of intrahepatic disease in patients who are waiting for liver transplant, Dr. Kwong and colleagues explained.
The researchers, who published their study findings in the May issue of Clinical Gastroenterology and Hepatology, sought to examine the national temporal trends and wait-list outcomes of LRT in 31,609 patients eligible for liver transplant with at least one approved HCC exception application in the United States.
Patient data were obtained from the Organ Procurement and Transplantation Network database and comprised primary adult LT candidates who were listed from the years 2003 to 2018. The investigators assessed explant histology and performed multivariable competing risk analysis to examine the relationship between the type of first LRT and time to wait-list dropout.
The wait-list dropout variable was defined by list removal because of death or excessive illness. The researchers noted that list removal likely represents disease progression “beyond transplantable criteria and beyond which patients were unlikely to benefit from or be eligible for further LRT.”
In the study population, the median age was 59 years, and approximately 77% of patients were male. More than half (53.1%) of the cohort had hepatitis C as the predominant liver disease etiology. Patients had a median follow-up period of 214 days on the waiting list.
Most patients (79%) received deceased- or living-donor transplants, and 18.6% of patients were removed from the waiting list. For patients listed between 2003 and 2006, the median wait-list time was 123 days; this increased to 257 days for patients listed between 2015 and 2018.
A total of 34,610 LRTs were performed among 24,145 liver transplant candidates during the study period. From 2003 to 2018, the proportion of patients with at least one LRT recorded in the database rose from 42.3% to 92.4%. Most patients (67.8%) who received liver-directed therapy had a single LRT, while 23.8% had two LRTs, 6.2% had three LRTs, and 2.2% had four or more.
The most frequent type of LRT performed was chemoembolization, followed by thermal ablation. Radioembolization increased from less than 5% in 2013 to 19% in 2018. Moreover, in 2018, chemoembolization accounted for 50% of LRTs, while thermal ablation accounted for 22% of LRTs.
The incidence rate of LRT per 100 wait-list days was above average in patients who had an initial tumor burden beyond the Milan criteria (0.188), an alpha-fetoprotein level of 21-40 ng/mL (0.171) or 41-500 ng/mL (0.179), or Child-Pugh class A disease (0.160); in patients in short (0.151) and medium (0.154) wait-time regions; and in patients listed after implementation of the cap-and-delay policy in October 2015 (0.192).
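These rates follow from simple person-time arithmetic; here is a minimal sketch with illustrative numbers (not the study's data):

```python
# Incidence rate per 100 wait-list days = treatments / total wait-list days * 100.
n_treatments = 47             # LRTs observed in a hypothetical subgroup
total_waitlist_days = 25_000  # summed wait-list days for that subgroup

rate_per_100_days = n_treatments / total_waitlist_days * 100
print(f"{rate_per_100_days:.3f} LRTs per 100 wait-list days")  # prints 0.188
```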
In the multivariable competing-risk analysis for wait-list dropout, adjusting for initial tumor burden, AFP, Child-Pugh class, wait region, and listing era, receiving no locoregional therapy was associated with an increased risk of wait-list dropout compared with chemoembolization as the first LRT (subhazard ratio, 1.37; 95% CI, 1.28-1.47). An inverse probability of treatment weighting–adjusted analysis found an association between radioembolization, when compared with chemoembolization, and a reduced risk of wait-list dropout (sHR, 0.85; 95% CI, 0.81-0.89). Thermal ablation was also associated with a reduced risk of wait-list dropout compared with chemoembolization (sHR, 0.95; 95% CI, 0.91-0.99). “Radioembolization and thermal ablation may be superior to chemoembolization and prove to be more cost-effective options, depending on the clinical context,” the researchers wrote.
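The subhazard ratios above come from a Fine-Gray subdistribution hazards model, which Python's lifelines library does not implement directly; as a stand-in for illustration only, the sketch below uses the related Aalen-Johansen estimator (which lifelines does provide) to compute the cumulative incidence of wait-list dropout while treating transplant as a competing event. All values are toy data:

```python
import pandas as pd
from lifelines import AalenJohansenFitter

# Toy wait-list data: time on the list in days and an outcome code
# (0 = still listed/censored, 1 = dropout due to death or illness, 2 = transplanted).
df = pd.DataFrame({
    "days":  [120, 300, 90, 410, 250, 180, 60, 500],
    "event": [1,   2,   2,  0,   1,   2,   2,  1],
})

# Cumulative incidence of dropout (event 1), with transplant (event 2) as a competing risk.
ajf = AalenJohansenFitter()
ajf.fit(df["days"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_.tail())
```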
The researchers noted that they were unable to determine whether patients removed from the waiting list were removed for disease progression or for liver failure.
The researchers reported no conflicts of interest with the pharmaceutical industry. The study received no industry funding.
In 1996, Mazzaferro and colleagues reported the results of a cohort of 48 patients with cirrhosis who had small, unresectable hepatocellular carcinoma (HCC). The actuarial survival rate was 75% at 4 years, and 83% of these patients had no recurrence, so orthotopic liver transplantation became one of the standard curative-intent options for the treatment of HCC. Because of HCC biology, some of these tumors grow or, in the worst case, progress beyond the Milan criteria. Locoregional therapies (LRT) were applied to arrest or downsize the tumor(s) to keep them within liver transplantation criteria.
Kwong and colleagues, using data from the Organ Procurement and Transplantation Network database, showed an exponential increase in LRT over 15 years: from 32.5% in 2003 to 92.4% in 2018. The Barcelona Clinic Liver Cancer staging system classifies chemoembolization, the most common LRT modality used in this cohort, as a palliative treatment rather than curative. Not surprisingly, the authors found that radioembolization was independently associated with a 15% reduction in the wait-list dropout rate, compared with chemoembolization. Further, listing in longer wait-time regions and in more recent years was independently associated with a higher likelihood of wait-list dropout.
These data may be worrisome for patients listed for HCC. The median Model for End-Stage Liver Disease at Transplant Minus 3 National Policy, introduced in May 2019, decreases transplantation rates in patients with HCC. Consequently, longer wait-list times lead to increased utilization of LRT to keep these patients within criteria. Radioembolization could become the preferred LRT, rather than chemoembolization, to stop tumor growth and will probably be more cost-effective. Future work should address explant outcomes, outcomes of downstaging with external radiation therapy, and the adjuvant use of immunotherapy.
Ruben Hernaez, MD, MPH, PhD, is an assistant professor at the Michael E. DeBakey Veterans Affairs Medical Center and Baylor College of Medicine, both in Houston. He has no relevant conflicts to disclose.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
AGA Clinical Practice Update: Expert review on personalizing GERD management
A recent American Gastroenterological Association Clinical Practice Update for evaluation and management of gastroesophageal reflux disease (GERD) focuses on delivering personalized diagnostic and therapeutic strategies.
The document includes new advice on upfront objective testing for isolated extraesophageal symptoms, on confirming the diagnosis of GERD before long-term therapy even in PPI responders, and on important elements of personalizing therapy.
Although GERD is common, with an estimated 30% of people in the United States experiencing symptoms, up to half of all individuals on proton pump inhibitor (PPI) therapy report incomplete symptom improvement. That could be due to the heterogeneous nature of symptoms, which may include heartburn and regurgitation, chest pain, and cough or sore throat, among others. Other conditions may produce similar symptoms or could be exacerbated by the presence of GERD.
The authors of the expert review, published in Clinical Gastroenterology and Hepatology, note that these considerations have driven increased interest in personalized approaches to the management of GERD. The practice update includes sections on how to approach GERD symptoms in the clinic, personalized diagnosis related to GERD symptoms, and precision management.
For initial management, the authors advise involving the patient in creating a care plan, providing patient education, and conducting a 4- to 8-week PPI trial in patients with heartburn, regurgitation, or noncardiac chest pain without accompanying alarm symptoms. If symptoms don’t improve to the patient’s satisfaction, dosing can be increased to twice daily, or a more effective acid suppressor can be substituted and continued at a once-daily dose. When the response to PPIs is adequate, the dose should be tapered to the lowest effective dose, or the patient could potentially be moved to H2 receptor antagonists or other antacids. However, patients with erosive esophagitis, biopsy-confirmed Barrett’s esophagus, or peptic stricture need to stay on long-term PPI therapy.
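The step-up/step-down logic described above can be summarized as a simple decision rule; the function below is a toy encoding for illustration only (the names and return strings are this article's paraphrase, not the update's wording, and real decisions involve alarm features and shared decision-making):

```python
def ppi_trial_next_step(symptoms_controlled: bool, current_dose: str,
                        must_stay_on_ppi: bool) -> str:
    """Toy sketch of the initial-management logic described above."""
    if must_stay_on_ppi:  # erosive esophagitis, Barrett's esophagus, peptic stricture
        return "continue long-term PPI therapy"
    if not symptoms_controlled:
        if current_dose == "once daily":
            return "increase to twice daily or switch to a more effective acid suppressor"
        return "proceed to objective testing"
    return "step down to lowest effective dose, or switch to H2RA/other antacids"

print(ppi_trial_next_step(symptoms_controlled=False, current_dose="once daily",
                          must_stay_on_ppi=False))
```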
The authors also gave advice on when to conduct objective testing. When a PPI trial doesn’t adequately address troublesome heartburn, regurgitation, and/or noncardiac chest pain, or if alarm symptoms are present, endoscopy should be employed to look for erosive reflux disease or long-segment Barrett’s esophagus as conclusive evidence for GERD. If these are absent, prolonged wireless pH monitoring while the patient is off medication is suggested. In addition, patients with extraesophageal symptoms suspected to be caused by reflux should undergo upfront objective reflux testing while off PPI therapy rather than an empiric PPI trial.
The authors advise that, if patients don’t have proven GERD and are continued on PPI therapy, they should be evaluated within 12 months to ensure that the therapy and dose are appropriate. Physicians should offer endoscopy with prolonged wireless reflux monitoring in the absence of PPI therapy (ideally after 2-4 weeks of withdrawal) to confirm that long-term PPI therapy is needed.
In the section on personalization of disease management, the authors note that ambulatory reflux monitoring and upper gastrointestinal endoscopy can be used to guide management of GERD. When upper GI endoscopy reveals no erosive findings and esophageal acid exposure time (AET) is less than 4% throughout all days of prolonged wireless pH monitoring, the physician can conclude that the patient has no pathologic gastroesophageal reflux and is likely to have a functional esophageal disorder. In contrast, erosive findings during upper GI endoscopy and/or an AET of more than 4% across at least 1 day of wireless pH monitoring support a GERD diagnosis.
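These thresholds amount to a compact decision rule; below is a toy sketch for illustration (the function and its inputs are this article's construction, not taken from the update):

```python
def classify_reflux(erosive_findings: bool, daily_aet_pct: list[float]) -> str:
    """Toy phenotyping rule combining endoscopy and wireless pH monitoring.

    erosive_findings: erosive esophagitis or long-segment Barrett's on endoscopy.
    daily_aet_pct: esophageal acid exposure time (%) for each monitored day.
    """
    if erosive_findings or any(day > 4.0 for day in daily_aet_pct):
        return "GERD diagnosis supported"
    if all(day < 4.0 for day in daily_aet_pct):
        return "no pathologic reflux; consider a functional esophageal disorder"
    return "borderline; further evaluation needed"

print(classify_reflux(False, [2.1, 3.4, 1.8, 2.9]))  # all days < 4%
print(classify_reflux(False, [5.2, 3.1, 6.0, 4.8]))  # > 4% on at least one day
```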
Optimization of PPI is important among patients with GERD, and the authors stress that patients should be educated about the safety of PPI use.
Adjunctive pharmacotherapy is useful and can include alginate antacids for breakthrough symptoms, H2RAs for nocturnal symptoms, baclofen to counter regurgitation or belching, and prokinetics for accompanying gastroparesis. The choice of medications depends on the phenotype, and they should not be used empirically.
For patients with functional heartburn or reflux disease linked to esophageal hypervigilance, reflux sensitivity, or behavioral disorders, options include pharmacologic neuromodulation, hypnotherapy provided by a behavioral therapist, cognitive behavioral therapy, and diaphragmatic breathing and relaxation.
If symptoms persist despite efforts at optimization of treatments and lifestyle factors, ambulatory 24-hour pH-impedance monitoring on PPI can be used to investigate mechanistic causes, especially when there is no known antireflux barrier abnormality, but the technique requires expertise to interpret correctly. This can ensure that the symptoms are not due to reflux hypersensitivity, rumination syndrome, or a belching disorder. When symptoms are confirmed to be treatment resistant, therapy should be escalated using a strategy that incorporates the pattern of reflux, the integrity of the antireflux barrier, obesity if present, and psychological factors.
Surgical options for confirmed GERD include laparoscopic fundoplication and magnetic sphincter augmentation. Transoral incisionless fundoplication can be performed endoscopically in selected patients. For obese patients with confirmed GERD, Roux-en-Y gastric bypass is effective at reducing reflux; it can also be used as a salvage treatment in nonobese patients. Sleeve gastrectomy may exacerbate GERD.
The authors reported relationships with Medtronic, Diversatek, Ironwood, and Takeda. The authors also reported funding from National Institutes of Health grants.
A recent American Gastroenterological Association Clinical Practice Update for evaluation and management of gastroesophageal reflux disease (GERD) focuses on delivering personalized diagnostic and therapeutic strategies.
The document includes new advice on use of upfront objective testing for isolated extraesophageal symptoms, confirmation of GERD diagnosis prior to long-term GERD therapy even in PPI responders, as well as important elements focused on personalization of therapy.
Although GERD is common, with an estimated 30% of people in the United States experiencing symptoms, up to half of all individuals on proton pump inhibitor (PPI) therapy report incomplete symptom improvement. That could be due to the heterogeneous nature of symptoms, which may include heartburn and regurgitation, chest pain, and cough or sore throat, among others. Other conditions may produce similar symptoms or could be exacerbated by the presence of GERD.
The authors of the expert review, published in Clinical Gastroenterology and Hepatology, note that these considerations have driven increased interest in personalized approaches to the management of GERD. The practice update includes sections on how to approach GERD symptoms in the clinic, personalized diagnosis related to GERD symptoms, and precision management.
In the initial management, the authors offer advice on involving the patient in creating a care plan, patient education, and conducting a 4- to 8-week PPI trial in patients with heartburn, regurgitation, or noncardiac chest pains without accompanying alarm signals. If symptoms don’t improve to the patient’s satisfaction, dosing can be boosted to twice per day, or a more effective acid suppressor can be substituted and continued at a once-daily dose. When the response to PPIs is adequate, the dose should be reduced until the lowest effective dose is reached, or the patient could potentially be moved to H2 receptor antagonists or other antacids. However, patients with erosive esophagitis, biopsy-confirmed Barrett’s esophagus, or peptic stricture must stay on long-term PPI therapy.
The authors also gave advice on when to conduct objective testing. When a PPI trial doesn’t adequately address troublesome heartburn, regurgitation, and/or noncardiac chest pain, or if alarm systems are present, endoscopy should be employed to look for erosive reflux disease or long-segment Barrett’s esophagus as conclusive evidence for GERD. If these are absent, prolonged wireless pH monitoring while a patient is off medication is suggested. In addition, patients with extraesophageal symptoms suspected to be caused by reflux should undergo upfront objective reflux testing while off PPI therapy rather than doing an empiric PPI trial.
The authors advise that, if patients don’t have proven GERD and are continued on PPI therapy, they should be evaluated within 12 months to ensure that the therapy and dose are appropriate. Physicians should offer endoscopy with prolonged wireless reflux monitoring in the absence of PPI therapy (ideally after 2-4 weeks of withdrawal) to confirm that long-term PPI therapy is needed.
In the section on personalization of disease management, the authors note that ambulatory reflux monitoring and upper gastrointestinal endoscopy can be used to guide management of GERD. When upper GI endoscopy reveals no erosive findings and esophageal acid exposure time (AET) is less than 4% throughout all days of prolonged wireless pH monitoring, the physician can conclude that the patient has no pathologic gastroesophageal reflux and is likely to have a functional esophageal disorder. In contrast, erosive findings during upper GI endoscopy and/or AET more than 4% across at least 1 day of wireless pH monitoring suggests a GERD diagnosis.
Optimization of PPI is important among patients with GERD, and the authors stress that patients should be educated about the safety of PPI use.
Adjunctive pharmacotherapy is useful and can include alginate antacids for breakthrough symptoms, H2RAs for nocturnal symptoms, baclofen to counter regurgitation or belching, and prokinetics for accompanying gastroparesis. The choice of medications depends on the phenotype, and they should not be used empirically.
For patients with functional heartburn or reflux disease linked to esophageal hypervigilance, reflux sensitivity, or behavioral disorders, options include pharmacologic neuromodulation, hypnotherapy provided by a behavioral therapist, cognitive behavioral therapy, and diaphragmatic breathing and relaxation.
If symptoms persist despite efforts at optimization of treatments and lifestyle factors, ambulatory 24-hour pH-impedance monitoring on PPI can be used to investigate mechanistic causes, especially when there is no known antireflux barrier abnormality, but the technique requires expertise to correctly interpret. This can ensure that the symptoms are not due to reflux hypersensitivity, rumination syndrome, or a belching disorder. When symptoms are confirmed to be treatment resistant, therapy should be escalated, using a strategy that incorporates a pattern of reflux, integrity of the antireflux barrier, obesity if present, and psychological factors.
Surgical options for confirmed GERD include laparoscopic fundoplication and magnetic sphincter augmentation. Transoral incisionless fundoplication can be performed endoscopically in selected patients. For obese patients with confirmed GERD, Roux-en-Y gastric bypass is effective at reducing reflux and can be used as a salvage treatment for nonobese patients. Sleeve gastrectomy may exacerbate GERD.
The authors reported relationships with Medtronic, Diversatek, Ironwood, and Takeda. The authors also reported funding from National Institutes of Health grants.
A recent American Gastroenterological Association Clinical Practice Update for evaluation and management of gastroesophageal reflux disease (GERD) focuses on delivering personalized diagnostic and therapeutic strategies.
The document includes new advice on use of upfront objective testing for isolated extraesophageal symptoms, confirmation of GERD diagnosis prior to long-term GERD therapy even in PPI responders, as well as important elements focused on personalization of therapy.
Although GERD is common, with an estimated 30% of people in the United States experiencing symptoms, up to half of all individuals on proton pump inhibitor (PPI) therapy report incomplete symptom improvement. That could be due to the heterogeneous nature of symptoms, which may include heartburn and regurgitation, chest pain, and cough or sore throat, among others. Other conditions may produce similar symptoms or could be exacerbated by the presence of GERD.
The authors of the expert review, published in Clinical Gastroenterology and Hepatology, note that these considerations have driven increased interest in personalized approaches to the management of GERD. The practice update includes sections on how to approach GERD symptoms in the clinic, personalized diagnosis related to GERD symptoms, and precision management.
In the initial management, the authors offer advice on involving the patient in creating a care plan, patient education, and conducting a 4- to 8-week PPI trial in patients with heartburn, regurgitation, or noncardiac chest pains without accompanying alarm signals. If symptoms don’t improve to the patient’s satisfaction, dosing can be boosted to twice per day, or a more effective acid suppressor can be substituted and continued at a once-daily dose. When the response to PPIs is adequate, the dose should be reduced until the lowest effective dose is reached, or the patient could potentially be moved to H2 receptor antagonists or other antacids. However, patients with erosive esophagitis, biopsy-confirmed Barrett’s esophagus, or peptic stricture must stay on long-term PPI therapy.
The authors also gave advice on when to conduct objective testing. When a PPI trial doesn’t adequately address troublesome heartburn, regurgitation, and/or noncardiac chest pain, or if alarm systems are present, endoscopy should be employed to look for erosive reflux disease or long-segment Barrett’s esophagus as conclusive evidence for GERD. If these are absent, prolonged wireless pH monitoring while a patient is off medication is suggested. In addition, patients with extraesophageal symptoms suspected to be caused by reflux should undergo upfront objective reflux testing while off PPI therapy rather than doing an empiric PPI trial.
The authors advise that, if patients don’t have proven GERD and are continued on PPI therapy, they should be evaluated within 12 months to ensure that the therapy and dose are appropriate. Physicians should offer endoscopy with prolonged wireless reflux monitoring in the absence of PPI therapy (ideally after 2-4 weeks of withdrawal) to confirm that long-term PPI therapy is needed.
In the section on personalization of disease management, the authors note that ambulatory reflux monitoring and upper gastrointestinal endoscopy can be used to guide management of GERD. When upper GI endoscopy reveals no erosive findings and esophageal acid exposure time (AET) is less than 4% throughout all days of prolonged wireless pH monitoring, the physician can conclude that the patient has no pathologic gastroesophageal reflux and is likely to have a functional esophageal disorder. In contrast, erosive findings during upper GI endoscopy and/or AET more than 4% across at least 1 day of wireless pH monitoring suggests a GERD diagnosis.
Optimization of PPI is important among patients with GERD, and the authors stress that patients should be educated about the safety of PPI use.
Adjunctive pharmacotherapy is useful and can include alginate antacids for breakthrough symptoms, H2RAs for nocturnal symptoms, baclofen to counter regurgitation or belching, and prokinetics for accompanying gastroparesis. The choice of medications depends on the phenotype, and they should not be used empirically.
For patients with functional heartburn or reflux disease linked to esophageal hypervigilance, reflux sensitivity, or behavioral disorders, options include pharmacologic neuromodulation, hypnotherapy provided by a behavioral therapist, cognitive behavioral therapy, and diaphragmatic breathing and relaxation.
If symptoms persist despite efforts to optimize treatment and lifestyle factors, ambulatory 24-hour pH-impedance monitoring on PPI therapy can be used to investigate mechanistic causes, especially when there is no known antireflux barrier abnormality, although the technique requires expertise to interpret correctly. This can ensure that the symptoms are not due to reflux hypersensitivity, rumination syndrome, or a belching disorder. When symptoms are confirmed to be treatment resistant, therapy should be escalated using a strategy that incorporates the pattern of reflux, the integrity of the antireflux barrier, obesity if present, and psychological factors.
Surgical options for confirmed GERD include laparoscopic fundoplication and magnetic sphincter augmentation. Transoral incisionless fundoplication can be performed endoscopically in selected patients. For patients with obesity and confirmed GERD, Roux-en-Y gastric bypass is effective at reducing reflux; it can also serve as a salvage treatment in nonobese patients. Sleeve gastrectomy may exacerbate GERD.
The authors reported relationships with Medtronic, Diversatek, Ironwood, and Takeda. The authors also reported funding from National Institutes of Health grants.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
AGA Clinical Practice Update: Expert review on deprescribing PPIs
An American Gastroenterological Association practice update on deprescribing proton-pump inhibitors (PPIs) delineates conditions under which drug withdrawal should be considered, and acknowledges that conversations between physicians and patients can be complicated. An inappropriate decision to discontinue PPI therapy can have significant consequences for the patient, while continued inappropriate use raises health care costs and may rarely lead to adverse effects.
One purpose of the update is to provide guidance when patients and providers don’t have the resources to systematically examine the issue, especially when other medical concerns may be in play. The authors also suggested that physicians involve pharmacists in implementing the best practice advice.
“None of these statements represents a radical departure from previously published guidance on PPI appropriateness and deprescribing: Our [recommendations] simply seek to summarize the evidence and to provide the clinician with a single document which distills the evidence down into clinically applicable guidance statements,” Laura Targownik, MD, associate professor of medicine at the University of Toronto and corresponding author of the practice update published in Gastroenterology, said in an interview.
“PPIs are highly effective medications for specific gastrointestinal conditions, and are largely safe. However, PPIs are often used in situations where they have minimal or no proven benefit, leading to unnecessary health care spending and unnecessary exposure to drugs. Our paper helps clinicians identify which patients require long-term PPI use as well as those who may be using them unnecessarily, and provides actionable advice on how to deprescribe PPIs from those deemed to be using them without clear benefit,” said Dr. Targownik.
An estimated 7%-15% of patients overall, and 40% of those over age 70, use PPIs at any given time, making them among the most commonly used drugs. About one in four patients who start PPIs will use them for a year or more. Aside from their use for acid-mediated upper gastrointestinal conditions, PPIs often find use for less well-defined complaints. Because PPIs are available over the counter, physicians may not even be involved in a patient’s decision to use them.
Although PPI use has been associated with adverse events, including chronic kidney disease, fractures, dementia, and greater risk of COVID-19 infection, there is no high-quality evidence to suggest that PPIs are directly responsible for any of these adverse events.
The authors suggested the primary care provider should periodically review and document the complaints or indications that prompt PPI use. When a patient is found to have no chronic condition that PPIs could reasonably address, the physician should consider a trial withdrawal. Patients who take PPIs twice daily for a known chronic condition should be considered for a reduction to a once-daily dose.
In general, PPI discontinuation is not a good option for most patients with complicated gastroesophageal reflux disease, such as those with a history of severe erosive esophagitis, esophageal ulcer, or peptic stricture. The same is true for patients with Barrett’s esophagus, eosinophilic esophagitis, or idiopathic pulmonary fibrosis.
Before any deprescribing is considered, the patient should be evaluated for risk of upper gastrointestinal bleeding, and those at high risk are not candidates for PPI deprescribing.
When the decision is made to withdraw PPIs, the patient should be advised of an increased risk of transient upper gastrointestinal symptoms caused by rebound acid hypersecretion.
The withdrawal of PPIs can be done abruptly, or the dose can be tapered gradually.
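Read together, the preceding paragraphs outline a decision flow. The Python sketch below is a loose rendering of that flow under simplified yes/no inputs; the function and parameter names are invented for illustration, and the snippet is no substitute for the full practice update.

    def ppi_deprescribing_advice(chronic_indication, complicated_gerd_or_barretts,
                                 high_gi_bleed_risk, twice_daily_dosing):
        # Complicated GERD, Barrett's esophagus, and similar conditions warrant
        # long-term PPI therapy.
        if complicated_gerd_or_barretts:
            return "continue long-term PPI therapy"
        # Patients at high risk of upper GI bleeding are not candidates.
        if high_gi_bleed_risk:
            return "not a candidate for deprescribing"
        # No chronic condition that PPIs could reasonably address: consider a
        # trial withdrawal, counseling about transient rebound acid hypersecretion.
        if not chronic_indication:
            return "consider trial withdrawal (abrupt stop or gradual taper)"
        # Twice-daily dosing for a known chronic condition: consider stepping down.
        if twice_daily_dosing:
            return "consider reduction to once-daily dosing"
        return "continue at lowest effective dose; re-review periodically"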
PPI-associated adverse events should not be a consideration when discussing the option of withdrawing from PPIs. Instead, the decision should be based on the absence of a specific reason for their use. A history of such adverse events, or a current adverse event, should not be the sole reason for discontinuation, nor should risk factors for adverse events. Concerns about adverse events have driven recent interest in reducing use of PPIs, but those adverse events were identified in retrospective studies and may be merely associated with PPI use rather than caused by it. In many cases there is no plausible mechanism, and no clinical trials have demonstrated increased adverse events in PPI users.
Three-quarters of physicians say they have altered treatment plans for patients because of concerns about PPI adverse events, and 80% say they would advise patients to withdraw PPIs if they learned the patient was at increased risk of upper gastrointestinal bleeding. Unnecessary withdrawal can lead to recurrent symptoms and complications when PPIs are effective treatments. “Therefore, physicians should not use concern about unproven complications of PPI use as a justification for PPI deprescribing if there remain ongoing valid indications for PPI use,” the authors wrote.
Dr. Targownik has received investigator-initiated funding from Janssen Canada and served on advisory boards for AbbVie Canada, Takeda Canada, Merck Canada, Pfizer Canada, Janssen Canada, Roche Canada, and Sandoz Canada. She is the lead on an IBD registry supported by AbbVie Canada, Takeda Canada, Merck Canada, Pfizer Canada, Amgen Canada, Roche Canada, and Sandoz Canada. None of the companies with which Dr. Targownik has a relationship are involved in the manufacturing, distribution, or sales of PPIs or any other agents mentioned in the manuscript.
FROM GASTROENTEROLOGY
Researchers present cellular atlas of the human gut
New research sheds light on how different cell types behave across all intestinal regions and demonstrates variations in gene expression between these cells across three independent organ donors.
Researchers led by Joseph Burclaff, PhD, of the University of North Carolina at Chapel Hill, explained that the regional differences observed in the study “highlight the importance of regional selection when studying the gut.” Dr. Burclaff and colleagues, whose findings were published online in Cellular and Molecular Gastroenterology and Hepatology, wrote that they hope their “database serves as a resource to understand how drugs affect the intestinal epithelium and as guidance for future precision medicine approaches.”
In the study, Dr. Burclaff and colleagues performed single-cell transcriptomics covering the duodenum, jejunum, and ileum, as well as the ascending, descending, and transverse colon, from three independently processed organ donors. The donors varied in age, race, and body mass index.
The investigators evaluated 12,590 single epithelial cells for organ-specific lineage biomarkers, differentially regulated genes, receptors, and drug targets. The focus of the analyses was on intrinsic cell properties and their capacity for response to extrinsic signals found along the gut axis.
The research group assigned cells to 25 epithelial lineage clusters. According to the researchers, multiple accepted intestinal cell markers did not specifically mark all intestinal stem cells. In addition, the investigators found that lysozyme expression was not unique to Paneth cells, demonstrating that lysozyme alone is insufficient to mark human Paneth cells, and that these cells lacked expression of certain “expected niche factors.”
Bestrophin-4+ (BEST4+) cells, which expressed neuropeptide Y, demonstrated maturational differences between the colon and small intestine, suggesting organ-specific maturation for tuft and BEST4+ cells. In addition, the data from Dr. Burclaff and colleagues suggest BEST4+ cells are engaged in “diverse roles within the intestinal epithelium, laying the groundwork for functional studies.”
The researchers noted that “tuft cells possess a broad ability to interact with the innate and adaptive immune systems through previously unreported receptors.” Specifically, the researchers found these cells exhibit genes believed to be important for taste signaling, monitoring intestinal content, and signaling the immune system.
Certain classes of cell junctions, hormones, mucins, and nutrient absorption genes demonstrated “unappreciated regional expression differences across lineages,” the researchers wrote. The investigators added that the differential expression of receptors as well as drug targets across lineages demonstrated “biological variation and the potential for variegated responses.”
The researchers noted that while the regional differences identified in their study show the importance of regional selection during gut investigations, several previous colonic single-cell RNA sequencing studies did not specify the sample region or explain “if pooled samples are from consistent regions.”
In the study, the investigators also assessed how drugs may affect the intestinal epithelium and why certain side effects associated with pharmacologic agents occur. The researchers identified 498 drugs approved by the Food and Drug Administration that had 232 primary gene targets expressed in the gut epithelial dataset.
In their analysis, the researchers found that carboxylesterase-2, which metabolizes the drug irinotecan into its biologically active form SN-38, is the most highly expressed phase 1 drug metabolism gene in the small intestine, whereas the phase 2 enzyme UGT1A1, which inactivates SN-38, shows low gut epithelial expression. The researchers explained that this finding suggests irinotecan may undergo prolonged activation in the gut, supporting the notion that the orally administered agent may have efficacy against cancers of the intestine.
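For readers who work with such datasets, the cross-referencing the investigators describe can be pictured as a simple table join between drug-target mappings and per-region expression values. The sketch below uses invented numbers and column names purely for illustration; it does not reproduce the study’s actual pipeline.

    import pandas as pd

    # Hypothetical drug-to-target table (a tiny slice of what an FDA drug
    # list might yield).
    drug_targets = pd.DataFrame({
        "drug": ["irinotecan", "irinotecan"],
        "target_gene": ["CES2", "UGT1A1"],  # activates vs. inactivates SN-38
    })

    # Hypothetical mean expression per gut region; the values are made up.
    mean_expression = pd.DataFrame({
        "gene": ["CES2", "UGT1A1"],
        "small_intestine": [9.1, 0.4],
        "colon": [2.3, 0.6],
    })

    merged = drug_targets.merge(mean_expression,
                                left_on="target_gene", right_on="gene")
    print(merged[["drug", "target_gene", "small_intestine", "colon"]])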
The researchers concluded their “database provides a foundation for understanding individual contributions of diverse epithelial cells across the length of the human intestine and colon to maintain physiologic function.”
The researchers reported no conflicts of interest with the pharmaceutical industry. The study received no industry funding.
Single cell transcriptomics has revolutionized our understanding of complex tissues, as this technology enables the identification of rare and/or novel cell types. Gastrointestinal science has benefited greatly from these technical advances, with multiple studies profiling liver, pancreas, stomach, and intestine in health and disease, in both mouse and human samples.
In sum, this study is the “final answer” for GI biologists needing a complete compendium of all genes active in the multitude of specialized human intestinal epithelial cells.
Klaus H. Kaestner, PhD, MS, is with the department of genetics and the Center for Molecular Studies in Digestive and Liver Diseases at the University of Pennsylvania, Philadelphia. He declares having no conflicts of interest.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
Guselkumab found promising for Crohn’s in phase 2 study
Conventional first-line therapies for Crohn’s disease (CD) often are not effective for maintaining clinical remission and are associated with significant toxicity concerns, wrote study investigator William J. Sandborn, MD, of the University of California, San Diego, and colleagues. Guselkumab is a human monoclonal antibody that selectively inhibits the p19 subunit of interleukin 23, a cytokine that plays an important role in gut inflammation, the researchers wrote. Their report was published online Feb. 5 in Gastroenterology.
In the phase 2 GALAXI-1 study, Dr. Sandborn and colleagues evaluated the safety and efficacy of guselkumab in 309 patients who had had moderate to severe CD for at least 3 months. All patients previously had experienced either an inadequate response or intolerance to conventional treatment or biologic agents.
Patients were randomly assigned to either placebo (n = 61); intravenous guselkumab at 200 mg (n = 61), 600 mg (n = 63), or 1,200 mg (n = 61) at weeks 0, 4, and 8; or a reference arm of ustekinumab, approximately 6 mg/kg IV at week 0 and 90 mg subcutaneously at week 8 (n = 63).
The study’s primary endpoint was the change from baseline to week 12 in the CD Activity Index (CDAI) score. The mean age of the population was 38.8 years, and the mean duration of CD was 8.8 years.
Some patients in the primary efficacy analysis set discontinued the study before week 12. At one point the study was paused to assess a serious adverse event of toxic hepatitis in a guselkumab-treated patient. Fifty-one patients were discontinued from the study because their induction treatment was paused during the adverse event evaluation; however, these patients were included in the safety analyses.
At the 12-week follow-up assessment, patients assigned to all doses of guselkumab experienced significantly greater reductions in the CD Activity Index from baseline when compared with placebo (least squares mean: 200 mg: –160.4; 600 mg: –138.9; and 1,200 mg: –144.9 vs. placebo: –36.2; all P < .05). In addition, a significantly greater proportion of patients in each guselkumab arm achieved clinical remission compared with the placebo group (CD Activity Index < 150; 57.4%, 55.6%, and 45.9% vs. 16.4%; all P < .05).
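As a concrete illustration of these endpoints, the snippet below computes the change from baseline and remission status for a single patient’s CDAI scores. The remission cutoff (CDAI < 150) comes from the article; the 100-point decrease used here for clinical response is an assumption added for illustration and may not match the trial’s exact definition.

    def cdai_outcomes(baseline, week12):
        # Negative change indicates improvement.
        change = week12 - baseline
        return {
            "change_from_baseline": change,
            "clinical_remission": week12 < 150,                      # per the article
            "clinical_response_assumed": baseline - week12 >= 100,   # assumed definition
        }

    print(cdai_outcomes(baseline=310, week12=140))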
Among the patients who had an inadequate response or intolerance to prior biologic therapy, 47.5% of those in the combined guselkumab arm and 10.0% in the placebo arm met the criteria for clinical remission at 12 weeks. In addition, 62.4% of patients in the combined guselkumab group and 20% in the placebo group within the prior biologic therapy subgroup achieved clinical response at week 12.
Of patients with inadequate response or intolerance to prior conventional therapy, approximately 60% treated with guselkumab at all doses vs. 22.6% of the placebo group had clinical remission by 12 weeks. Also within this subgroup, 70.2% of patients in the combined guselkumab arm and 29% in the placebo arm had clinical response.
Finally, among the 360 patients in the safety analysis set, the proportions of patients with at least one adverse event were similar across the treatment groups during the treatment period (60% for placebo; 45.7% for guselkumab combined; 50.7% for ustekinumab).
There was no observable relationship between the dose of guselkumab and the proportion of patients with adverse events. Infection rates were 21.4% in the placebo arm, 15.1% in the combined guselkumab group, and 12.7% in the ustekinumab arm. Approximately 3.7% of patients in the combined guselkumab arm, 5.7% of patients in the placebo arm, and 5.6% of patients in the ustekinumab arm experienced at least one serious adverse event.
Greater proportions of patients receiving guselkumab achieved clinical response, Patient Reported Outcomes–2 remission, clinical-biomarker response, and endoscopic response at week 12 vs. placebo. Efficacy of ustekinumab vs. placebo was demonstrated. Safety event rates were generally similar across treatment groups.
Limitations of the study included the small number of patients in the overall dataset and the relatively short treatment period of 12 weeks. The researchers noted that phase 3 studies of guselkumab for the treatment of Crohn’s disease are underway.
Several of the researchers reported conflicts of interest with the pharmaceutical industry. The study received funding from Janssen Research & Development, LLC.
Over the last 20 years, multiple targeted therapies have been developed for Crohn’s disease (CD) and have changed the management landscape for this chronic disease. Despite many successes, a proportion of patients still experience treatment failure or intolerance to the currently available biologics, and the need for ongoing development of new therapies remains. This study by Sandborn and colleagues highlights the development of a novel therapy for Crohn’s disease patients. The novel therapy, guselkumab, targets a more specific interleukin pathway (IL-23p19 inhibition) than currently available agents. In the study, guselkumab was found to be effective at improving multiple clinical parameters, such as the Crohn’s Disease Activity Index and Patient-Reported Outcome-2, as well as objective parameters, including biomarker response and endoscopic response, in patients with moderate to severe CD. There was no apparent exposure-response relationship observed over the multiple dose regimens. Guselkumab also demonstrated a favorable safety profile.
Robin Dalal, MD, is an assistant professor of medicine, director of IBD education, and director of the advanced IBD fellowship at Vanderbilt University Medical Center in Nashville, Tenn. She reported being a consultant for AbbVie.
FROM GASTROENTEROLOGY
Registry data support lowering CRC screening age to 45
Approximately one-third of people between 45 and 49 years of age who undergo colonoscopies have neoplastic colorectal pathology, according to a retrospective analysis.
According to the researchers, led by Parth Trivedi, MD, of the Icahn School of Medicine at Mount Sinai, New York, there has been a progressive and “disturbing” rise in early-onset colorectal cancer (CRC) in the United States, prompting guideline bodies from the American Cancer Society to the U.S. Preventive Services Task Force to recommend lowering the CRC screening starting age to 45 years for average-risk individuals. Despite these recommendations, little research to date has fully characterized the prevalence of colorectal neoplasia in individuals younger than 50 years, the previously recommended age for starting CRC screening.
Dr. Trivedi and colleagues, who published their study findings in Gastroenterology, retrospectively reviewed colonoscopy data recorded in the Gastrointestinal Quality Improvement Consortium Registry to address the current knowledge gaps on early-onset CRC. Collected data were for procedures conducted at 123 AMSURG ambulatory endoscopy centers across 29 states between January 2014 and February 2021. In total, 2,921,816 colonoscopies during the study period among patients aged 18-54 years were recorded by AMSURG-associated endoscopists; of these, 562,559 met inclusion criteria for high-quality screening or diagnostic colonoscopy procedures.
The researchers pooled a young-onset age group of patients aged 18-49 years, in whom 145,998 procedures were performed, including 79,934 procedures in patients aged 45-49 years. A comparator group with 336,627 procedures in patients aged 50-54 years was also included in the study. The findings were categorized as CRC, advanced premalignant lesions (APL), and “any neoplasia,” the latter of which included all adenomas, sessile serrated polyps, and CRC.
Among patients aged 18-44 years, the most frequent indications were “diagnostic-other” (45.6%) as well as “diagnostic-bleeding” (39.4%). Among patients between 45 and 49 years of age, the most frequent indications were “screening” (41.4%) and “diagnostic-other” (30.7%). Nearly all (90%) procedures among those aged 50-54 years were for screening.
A multivariable logistic regression identified five variables predictive of either APL or CRC in patients between 18 and 49 years of age: increasing age (odds ratio, 1.08; 95% confidence interval, 1.07-1.08; P < .01); male sex (OR, 1.67; 95% CI, 1.63-1.70; P < .01); race (African American vs. White: OR, 0.76; 95% CI, 0.73-0.79; Asian vs. White: OR, 0.89; 95% CI, 0.84-0.94; both P < .01); family history of CRC (OR, 1.21; 95% CI, 1.16-1.26; P < .01) or polyps (OR, 1.33; 95% CI, 1.24-1.43; P < .01); and examinations for bleeding (OR, 1.15; 95% CI, 1.12-1.18; P < .01) or screening (OR, 1.20; 95% CI, 1.16-1.24; P < .01).
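For readers unfamiliar with where such numbers come from, odds ratios are the exponentiated coefficients of a fitted logistic regression, and their confidence intervals are the exponentiated coefficient intervals. The toy example below fits a model to simulated data (not the registry data) and reports results in the same form.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 5000
    age = rng.uniform(18, 50, n)
    male = rng.integers(0, 2, n)

    # Simulated outcome with made-up true effects on the log-odds scale.
    logit = -6 + 0.08 * age + 0.5 * male
    y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    X = sm.add_constant(np.column_stack([age, male]))
    fit = sm.Logit(y, X).fit(disp=0)

    print(np.exp(fit.params[1:]))      # odds ratios for age and male sex
    print(np.exp(fit.conf_int()[1:]))  # 95% confidence intervals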
The prevalence of neoplastic findings in the young-onset age group increased with increasing age across the categories of any neoplasia, APL, and CRC. Among patients aged 40-44 years, 26.59% had any neoplasia, 5.76% had APL, and 0.53% had CRC. Among those aged 45-49 years, approximately 32% had any neoplasia, 7.5% had APL, and 0.58% had CRC. In the 50- to 54-year-old group, the prevalences of any neoplasia, APL, and CRC were 37.72%, 9.48%, and 0.32%, respectively.
Across all age groups, a family history of CRC was associated with a higher prevalence of any neoplasia and APL. In addition, the rates of APL and any neoplasia in patients with a family history of CRC were comparable to those of patients who were 5 years older but had no family history of the disease. Across most young-onset age groups, individuals with a positive family history had a lower CRC prevalence than patients with no family history.
The researchers noted that their population data are derived from ambulatory endoscopy centers, which may introduce bias associated with insurance coverage or patient preference to attend specific endoscopic centers. Additionally, the investigators stated that many records on race and ethnicity were missing, further limiting the findings.
“The present analysis of neoplastic colorectal pathology among individuals younger than age 50 suggests that lowering the screening age to 45 for men and women of all races and ethnicities will likely detect important pathology rather frequently,” they concluded. In addition, the researchers noted that the study results “underscore the importance of early messaging to patients and providers in the years leading up to age 45.” Ultimately, improved “awareness of pathology prevalence in individuals younger than age 45 can help guide clinicians in the clinical management of CRC risk,” the researchers wrote.
Several of the researchers reported conflicts of interest with Exact Sciences Corp and Freenome. The study received no industry funding.
An alarming trend of increased colorectal cancer (CRC) incidence has been noted among individuals 20-49 years of age over the past 2 decades. This fact, combined with the results of microsimulation studies, has led all purveyors of CRC screening guidelines in the United States to lower their recommended age for the initiation of average-risk screening from 50 to 45. Despite this major shift in recommendations, relatively little is known about the rates of premalignant neoplasia in this population.
Future studies will need to document the actual effectiveness of CRC screening in persons aged 45-49 and examine the actual cost-benefit of lowering the recommended screening age.
Reid M. Ness, MD, MPH, AGAF is an associate professor in the division of gastroenterology, hepatology and nutrition, department of medicine, Vanderbilt University Medical Center, Nashville, Tenn., and at the VA Tennessee Valley Healthcare System, Nashville campus. He is also an investigator in the Vanderbilt-Ingram Cancer Center. Dr. Ness is a study investigator with Guardant Health.
An alarming trend of increased colorectal cancer (CRC) incidence has been noted among individuals 20-49 years of age over the past 2 decades. This fact combined with the results of microsimulation studies have led all purveyors of CRC screening guidelines in the United States to lower their recommended age for the initiation of average-risk screening from 50 to 45. Despite this major shift in recommendations, relatively little is known about the rates of premalignant neoplasia in this population.
Future studies will need to document the actual effectiveness of CRC screening in persons aged 45-49 and examine the actual cost-benefit of lowering the recommended screening age.
Reid M. Ness, MD, MPH, AGAF is an associate professor in the division of gastroenterology, hepatology and nutrition, department of medicine, Vanderbilt University Medical Center, Nashville, Tenn., and at the VA Tennessee Valley Healthcare System, Nashville campus. He is also an investigator in the Vanderbilt-Ingram Cancer Center. Dr. Ness is a study investigator with Guardant Health.
An alarming trend of increased colorectal cancer (CRC) incidence has been noted among individuals 20-49 years of age over the past 2 decades. This fact combined with the results of microsimulation studies have led all purveyors of CRC screening guidelines in the United States to lower their recommended age for the initiation of average-risk screening from 50 to 45. Despite this major shift in recommendations, relatively little is known about the rates of premalignant neoplasia in this population.
Future studies will need to document the actual effectiveness of CRC screening in persons aged 45-49 and examine the actual cost-benefit of lowering the recommended screening age.
Reid M. Ness, MD, MPH, AGAF is an associate professor in the division of gastroenterology, hepatology and nutrition, department of medicine, Vanderbilt University Medical Center, Nashville, Tenn., and at the VA Tennessee Valley Healthcare System, Nashville campus. He is also an investigator in the Vanderbilt-Ingram Cancer Center. Dr. Ness is a study investigator with Guardant Health.
Approximately one-third of people between 45 and 49 years of age who undergo colonoscopies have neoplastic colorectal pathology, according to a retrospective analysis.
According to the researchers, led by Parth Trivedi, MD, of the Icahn School of Medicine at Mount Sinai, New York, there has been a “disturbing” rise in early-onset colorectal cancer (CRC) in the United States, prompting organizations from the American Cancer Society to the U.S. Preventive Services Task Force to recommend lowering the CRC screening starting age to 45 years for average-risk individuals. Despite these recommendations, little research to date has fully characterized the prevalence of colorectal neoplasia in individuals younger than the long-standing screening initiation age of 50 years.
Dr. Trivedi and colleagues, whose findings were published in Gastroenterology, retrospectively reviewed colonoscopy data recorded in the Gastrointestinal Quality Improvement Consortium Registry to address these knowledge gaps. Data covered procedures performed at 123 AMSURG ambulatory endoscopy centers across 29 states between January 2014 and February 2021. In total, 2,921,816 colonoscopies among patients aged 18-54 years were recorded by AMSURG-associated endoscopists during the study period; of these, 562,559 met inclusion criteria for high-quality screening or diagnostic colonoscopy.
The researchers pooled patients aged 18-49 years into a young-onset group, in whom 145,998 procedures were performed, including 79,934 procedures in patients aged 45-49 years. A comparator group of 336,627 procedures in patients aged 50-54 years was also included. Findings were categorized as CRC, advanced premalignant lesions (APLs), and “any neoplasia,” the last of which included all adenomas, sessile serrated polyps, and CRC.
Among patients aged 18-44 years, the most frequent indications were “diagnostic-other” (45.6%) and “diagnostic-bleeding” (39.4%). Among patients aged 45-49 years, the most frequent indications were “screening” (41.4%) and “diagnostic-other” (30.7%). Most (90%) procedures among those aged 50-54 years were for screening.
A multivariable logistic regression identified 5 variables predictive of either APL or CRC in patients between 18 and 49 years of age: increasing age (odds ratio, 1.08; 95% confidence interval, 1.07-1.08; P <0.01), male sex (OR = 1.67; 95% CI, 1.63-1.70; P <0.01), White race (vs. African American: OR = 0.76; 95% CI, 0.73-0.79, P <0.01; vs. Asian: OR = 0.89; 95% CI, 0.84-0.94, P <0.01), family history of CRC (OR = 1.21; 95% CI, 1.16-1.26; P <0.01) and polyps (OR = 1.33; 95% CI, 1.24-1.43; P <0.01), and examinations for bleeding (OR = 1.15; 95% CI, 1.12-1.18; P <0.01) or screening (OR = 1.20; 95% CI, 1.16-1.24; P <0.01).
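Adjusted odds ratios of this kind come from exponentiating the coefficients of a multivariable logistic regression. The sketch below is purely illustrative – synthetic data, invented variable names and effect sizes, and only a subset of the predictors the authors modeled, not the study’s code – but it shows how such ORs and 95% CIs are produced:

import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulate a cohort (synthetic; only the model form mirrors the study).
rng = np.random.default_rng(0)
n = 50_000
df = pd.DataFrame({
    "age": rng.uniform(18, 49, n),
    "male": rng.integers(0, 2, n),
    "family_hx_crc": rng.integers(0, 2, n),
})
# Assumed true log-odds, chosen only so the fit has something to recover.
log_odds = -6.0 + 0.08 * df["age"] + 0.5 * df["male"] + 0.2 * df["family_hx_crc"]
df["apl_or_crc"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-log_odds)))

# Fit the multivariable logistic regression.
X = sm.add_constant(df[["age", "male", "family_hx_crc"]])
fit = sm.Logit(df["apl_or_crc"], X).fit(disp=False)
print(np.exp(fit.params))      # adjusted odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals

Exponentiating each coefficient and its confidence bounds yields exactly the adjusted OR and 95% CI format quoted above.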
The prevalence of neoplastic findings in the young-onset age group increased with age for the categories of any neoplasia, APLs, and CRC. Among patients aged 40-44 years, 26.59% had any neoplasia, 5.76% had APLs, and 0.53% had CRC. In those aged 45-49 years, around 32% had any neoplasia, approximately 7.5% had APLs, and 0.58% had CRC. In the 50- to 54-year-old group, the prevalences of any neoplasia, APLs, and CRC were 37.72%, 9.48%, and 0.32%, respectively.
Across all age groups, a family history of CRC was associated with a higher prevalence of any neoplasia and APLs. In addition, the rates of any neoplasia and APLs in patients with a family history of CRC were comparable to those of patients who were 5 years older but had no family history of the disease. Across most young-onset age groups, individuals with a positive family history had a lower CRC prevalence than patients with no family history.
The researchers noted that their data are derived from ambulatory endoscopy centers, which may introduce bias associated with insurance coverage or patient preference for specific endoscopy centers. Additionally, the investigators stated that many records on race and ethnicity were missing, further limiting the findings.
“The present analysis of neoplastic colorectal pathology among individuals younger than age 50 suggests that lowering the screening age to 45 for men and women of all races and ethnicities will likely detect important pathology rather frequently,” they concluded. In addition, the researchers noted that the study results “underscore the importance of early messaging to patients and providers in the years leading up to age 45.” Ultimately, improved “awareness of pathology prevalence in individuals younger than age 45 can help guide clinicians in the clinical management of CRC risk,” the researchers wrote.
Several of the researchers reported conflicts of interest with Exact Sciences Corp and Freenome. The study received no industry funding.
An alarming trend of increased colorectal cancer (CRC) incidence has been noted among individuals 20-49 years of age over the past 2 decades. This fact, combined with the results of microsimulation studies, has led all purveyors of CRC screening guidelines in the United States to lower their recommended age for the initiation of average-risk screening from 50 to 45. Despite this major shift in recommendations, relatively little is known about the rates of premalignant neoplasia in this population.
Future studies will need to document the actual effectiveness of CRC screening in persons aged 45-49 and examine the cost-benefit of lowering the recommended screening age.
Reid M. Ness, MD, MPH, AGAF, is an associate professor in the division of gastroenterology, hepatology and nutrition, department of medicine, Vanderbilt University Medical Center, Nashville, Tenn., and at the VA Tennessee Valley Healthcare System, Nashville campus. He is also an investigator in the Vanderbilt-Ingram Cancer Center. Dr. Ness is a study investigator with Guardant Health.
FROM GASTROENTEROLOGY
Cryoballoon ablation demonstrates long-term durability in BE
Similar to radiofrequency ablation, cryoballoon ablation (CBA) is a durable approach that can eradicate Barrett’s esophagus (BE) in treatment-naive patients with dysplastic BE, according to a single-center cohort study.
Endoscopic mucosal resection (EMR), radiofrequency ablation (RFA), and cryotherapy are established techniques used in endoscopic eradication therapy (EET) of BE, wrote study authors led by Mohamad Dbouk, MD, of Johns Hopkins Medical Institutions in Baltimore, in Techniques and Innovations in Gastrointestinal Endoscopy. Unlike RFA, which uses heat to induce tissue necrosis and reepithelialization with normal tissue, cryotherapy applies extreme cold to treat BE. While cryotherapy has been studied over the past decade as an alternative ablative modality, Dr. Dbouk and colleagues noted that long-term data on its durability and outcomes are lacking.
To gauge the durability of CBA for dysplastic BE, the researchers examined outcomes of 59 consecutive, treatment-naive patients with BE and confirmed low-grade dysplasia (n = 22), high-grade dysplasia (n = 33), or intramucosal cancer (n = 4), all of whom were treated with CBA for BE eradication. The single-center cohort had a mean age of 66.8 years (91.5% male). The mean length of BE was 5 cm, although 23.7% of patients had BE ≥8 cm in length.
Following confirmation of dysplastic BE in biopsies and/or EMR specimens at baseline, patients underwent CBA of the gastric cardia and all visible BE with the cryoballoon focal ablation system, using a focal or pear-shaped cryoballoon. The investigators performed surveillance esophagogastroduodenoscopy (EGD) to assess the response to CBA. Patients with high-grade dysplasia (HGD) underwent EGD and biopsy every 3 months for the first year after completing CBA, every 6 months for the second year, and once per year thereafter. Those whose baseline biopsies showed low-grade dysplasia (LGD) underwent EGD and biopsy every 6 months during the first year after CBA and annually thereafter. Retreatment with ablation was allowed if recurrent dysplasia or intestinal metaplasia was found.
The study’s primary endpoints were short-term efficacy, defined as the rates of complete eradication of dysplasia (CE-D) and of intestinal metaplasia (CE-IM) at 1-year follow-up, and durability, defined as the proportion of patients achieving CE-D and CE-IM within 18 months and maintaining both at 2- and 3-year follow-up.
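As a minimal sketch of how a definition like this becomes computable – assuming a long-format table of per-patient surveillance results with invented column names – the durability criterion reduces to requiring eradication at every follow-up assessment:

import pandas as pd

# Synthetic surveillance table: one row per patient per follow-up year.
fu = pd.DataFrame({
    "patient": [1, 1, 1, 2, 2, 2],
    "year":    [1, 2, 3, 1, 2, 3],
    "ce_im":   [True, True, True, True, False, True],  # CE-IM status at that visit
})
# Durable CE-IM: achieved at year 1 and maintained at every later assessment.
durable = fu.groupby("patient")["ce_im"].all()
print(durable)  # patient 1: True; patient 2: False (lost CE-IM at year 2)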
The median follow-up period was 54.3 months. Of the 56 patients evaluable at 1 year, approximately 95% achieved CE-D and 75% achieved CE-IM. When patients were stratified by baseline dysplasia grade, the rate of CE-D was 96% in both the LGD and HGD groups. The median number of CBA sessions needed to achieve CE-IM at 1 year was 3.
Throughout treatment and follow-up, no patient progressed beyond the baseline dysplasia grade or developed esophageal cancer. All patients maintained CE-D at years 2, 3, and 4. In addition, 98% of patients had CE-IM at 2 years, 98% at 3 years, and 97% at 4 years. After stratification by baseline dysplasia grade, the researchers found no significant difference between groups in the rates of CE-D and CE-IM at each follow-up year.
Of the 48 patients who initially achieved CE-IM, seven (14.6%) developed recurrent intestinal metaplasia (IM) – six in the esophagus and one at the gastroesophageal junction – after a median of 20.7 months. Approximately 57% of patients with recurrent IM had LGD at baseline, while 43% had HGD. BE length was not significantly associated with the risk of IM recurrence in a Cox proportional hazards analysis (hazard ratio, 1.02; 95% confidence interval, 0.86-1.2; P = .8).
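A hazard ratio of this kind comes from a Cox proportional hazards model of time to IM recurrence. The following is a hedged sketch of that model form using the lifelines library on simulated data – column names and values are invented, and only the structure (BE length as covariate, months to recurrence or censoring as the timescale) mirrors the analysis described:

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 48  # number of patients who achieved CE-IM in the study
df = pd.DataFrame({
    "be_length_cm": rng.uniform(1, 12, n),  # baseline BE length (invented)
    "months": rng.uniform(6, 54, n),        # time to recurrence or censoring
    "im_recurred": rng.integers(0, 2, n),   # 1 = recurrence observed (invented)
})
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="im_recurred")
cph.print_summary()  # exp(coef) is the hazard ratio per 1-cm increase in BE length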
Approximately 8.5% of patients developed post-CBA strictures that required dilation during the study period. In a bivariate analysis, patients with a BE length of ≥8 cm were significantly more likely to develop strictures than patients without ultra-long BE (28.6% vs. 2.2%, respectively; P = .009). Strictures occurred during the first 4 months after the initial CBA, and the median time from the first CBA treatment to stricture detection on follow-up EGD was 2 months. Around 1.7% of patients experienced postprocedural bleeding requiring clip closure; the affected patients were taking clopidogrel for atrial fibrillation during the first year of active treatment.
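The article does not say which test produced that P value, but the counts can be approximately back-calculated (59 patients with 23.7% at ≥8 cm gives roughly 4 of 14 ultra-long vs. 1 of 45 shorter-segment patients with strictures), and a Fisher’s exact test on that table – a standard choice for proportions this small – reproduces P ≈ .009:

from scipy.stats import fisher_exact

# Rows: BE >= 8 cm, BE < 8 cm; columns: stricture, no stricture.
# Counts are back-calculated approximations, not taken from the paper's tables.
table = [[4, 10],
         [1, 44]]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.1f}, P = {p_value:.3f}")  # P comes out near .009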
Limitations of the study included the small sample size as well as the inclusion of patients from a single center, which the researchers suggest may limit the generalizability of the results.
“More research is needed to confirm the long-term durability of CBA,” the authors concluded. “Randomized controlled trials comparing CBA with RFA are needed to assess the role of CBA as a first-line and rescue EET.”
Several of the researchers reported conflicts of interest with industry. The study received no industry funding.
Barrett’s endoscopic eradication therapy – resection of visible lesions followed by ablation of the remaining Barrett’s mucosa – is the standard of care for dysplasia management. Radiofrequency ablation (RFA) is 91% successful in eliminating dysplasia and 78% successful in eliminating intestinal metaplasia (IM). Recurrence of dysplasia is rare, although recurrence of IM occurs in about 20% of patients.
Given the impressive results of RFA, one might ask why alternative ablation therapies are needed. Cryoballoon focal ablation system (CbFAS) equipment costs are lower than those of RFA, and postprocedure discomfort may be less. Failure of ablation is poorly understood; it is likely attributable to inadequate reflux suppression and perhaps to thicker areas of Barrett’s mucosa. The greater depth of injury with cryoablation may succeed in some cases of RFA failure. The complexity of this ablation procedure remains high, and excessive overlap of treatment sites probably explains the higher stricture rate. Where cryoballoon ablation fits in the Barrett’s ablation paradigm is not yet clear. The lower cost and availability may provide traction for this new technology in the established field of Barrett’s ablation.
Bruce D. Greenwald, MD, is a professor of medicine at the University of Maryland, Baltimore, and the Marlene and Stewart Greenebaum Comprehensive Cancer Center, Baltimore. He is a consultant for Steris Endoscopy.
FROM TECHNIQUES AND INNOVATIONS IN GASTROINTESTINAL ENDOSCOPY
New HBV model may open door to more effective antivirals
A new mouse model that better represents chronic infection with hepatitis B virus (HBV) in humans may lead to more effective antiviral therapies for HBV, according to investigators.
During human infection, HBV genomes take the form of covalently closed circular DNA (cccDNA), a structure that has thwarted effective antiviral therapy and, until now, creation of an accurate mouse model, reported lead author Zaichao Xu, PhD, of Wuhan (China) University and colleagues.
“As the viral persistence reservoir plays a central role in HBV infection, HBV cccDNA is the key obstacle for a cure,” the investigators wrote in Cellular and Molecular Gastroenterology and Hepatology.
Although several previous mouse models have approximated this phenomenon with recombinant cccDNA-like molecules (rcccDNA), the present model is the first to achieve genuine cccDNA, which does not naturally occur in mice.
“Although rcccDNA supports persistent viral replication and antigen expression, the nature of rcccDNA may differ from authentic cccDNA, as additional sequences, like LoxP or attR, were inserted into the HBV genome,” the investigators noted.
The new model was created by first constructing an adeno-associated virus vector carrying a replication-deficient HBV1.04-fold genome (AAV-HBV1.04). When injected into mice, the vector led to cccDNA formation via ataxia-telangiectasia and Rad3-related protein (ATR)–mediated DNA damage response, a finding that was confirmed by blocking the same process with ATR inhibitors.
Immediately after injection, mice tested positive for both hepatitis B e antigen (HBeAg) and hepatitis B surface antigen (HBsAg), with peak concentrations after either 4 or 8 weeks depending on dose. HBV DNA was also detected in serum after injection, and 50% of hepatocytes exhibited HBsAg and hepatitis B core protein (HBc) after 1 week. At week 66, HBsAg, HBeAg, and HBc were still detectable in the liver.
“The expression of HBc could only be observed in the liver, but not in other organs or tissues, suggesting that the AAV-HBV1.04 only targeted the mouse liver,” the investigators wrote.
Further experimentation involving known cccDNA-binding proteins supported the similarity between cccDNA in the mouse model and natural infection.
“These results suggested that the chromatinization and transcriptional activation of cccDNA formed in this model does not differ from wild-type cccDNA formed through infection,” they wrote.
Next, Dr. Xu and colleagues demonstrated that the infected mice could serve as a reliable model for antiviral research. One week after injection with the vector, mice were treated with entecavir, polyinosinic-polycytidylic acid (poly[I:C]), or phosphate-buffered saline (PBS; control). As anticipated, entecavir suppressed circulating HBV DNA, but not HBsAg, HBeAg, or HBV cccDNA, whereas treatment with poly(I:C) reduced all HBV markers.
“This novel mouse model will provide a unique platform for studying HBV cccDNA and developing novel antivirals to achieve HBV cure,” the investigators concluded.
The study was supported by the National Natural Science Foundation of China, the Fundamental Research Funds for the Central Universities, Hubei Province’s Outstanding Medical Academic Leader Program, and others. The investigators reported no conflicts of interest.
On the heels of the wondrous development of curative antiviral agents for hepatitis C virus (HCV), renewed attention has been directed to efforts to bring about the cure of HBV. However, this task will hinge on successful elimination of covalently closed circular DNA (cccDNA), a highly stable form of viral DNA that is exceedingly difficult to clear. Efforts to develop successful curative strategies will in turn rely on development of small animal models that support HBV cccDNA formation and virus production, which has until recently proved elusive. In the past several years, several mouse HBV models supporting cccDNA formation have been constructed using adeno-associated vector (AAV)–mediated transduction of a linearized HBV genome. Both the AAV-HBV linear episome and cccDNA have been consistently replicated and detected in these models. While they recapitulate the key steps of the viral life cycle, these models do not lend themselves to direct assessment of cccDNA, which has traditionally required detection in liver tissue.
Raymond T. Chung, MD, is a professor of medicine at Harvard Medical School and director of the Hepatology and Liver Center at Massachusetts General Hospital, both in Boston. He has no conflicts to disclose.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
Bowel ultrasound may overtake colonoscopy in Crohn’s
Bowel ultrasound predicts the clinical course of Crohn’s disease for up to 1 year, according to results of a prospective trial involving 225 patients.
After additional confirmation in larger studies, ultrasound could serve as a noninvasive alternative to colonoscopy for monitoring and predicting disease course, reported lead author Mariangela Allocca, MD, PhD, of Humanitas University, Milan, and colleagues.
“Frequent colonoscopies are expensive, invasive, and not well tolerated by patients, thus noninvasive tools for assessment and monitoring are strongly needed,” the investigators wrote in Clinical Gastroenterology and Hepatology. “Bowel ultrasound accurately detects inflammatory bowel disease activity, extent, and complications, particularly in Crohn’s disease. Considering its low cost, minimal invasiveness ... and easy repeatability, bowel ultrasound may be a simple, readily available tool for assessing and monitoring Crohn’s disease.”
To test this hypothesis, Dr. Allocca and colleagues enrolled 225 consecutive patients with ileal and/or colonic Crohn’s disease diagnosed for at least 6 months and managed at a tertiary hospital in Italy. All patients underwent both colonoscopy and bowel ultrasound with no more than 3 months between each procedure.
Colonoscopy results were characterized by the Simplified Endoscopic Score for Crohn’s disease (SES-CD), whereas ultrasound was scored using several parameters, including bowel wall pattern, bowel wall thickness, bowel wall flow, presence of complications (abscess, fistula, stricture), and characteristics of mesenteric lymph nodes and tissue. Ultrasound scores were considered high if they exceeded a cutoff of 3.52, which was determined by receiver operating characteristic (ROC) curve analysis.
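The paper states only that the 3.52 cutoff came from ROC analysis, so the selection rule below – maximizing Youden’s J against the SES-CD reference standard – is an assumption, and the data are simulated; the sketch shows only the general mechanics of deriving such a cutoff:

import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(2)
n = 225
# Invented labels/scores: 1 = endoscopically active disease per SES-CD.
active = rng.integers(0, 2, n)
score = rng.normal(3.0 + 1.5 * active, 1.0)  # ultrasound score, higher when active

fpr, tpr, thresholds = roc_curve(active, score)
best = np.argmax(tpr - fpr)  # Youden's J = sensitivity + specificity - 1
print(f"optimal cutoff ~ {thresholds[best]:.2f}")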
Participants were followed for 12 months after baseline ultrasound. The primary objective was to determine the relationship between baseline ultrasound findings and negative disease course, defined by steroid usage, need for surgery, need for hospitalization, and/or change in therapy. The secondary objective was to understand the relationship between ultrasound findings and endoscopy activity.
Multivariable analysis revealed that ultrasound scores greater than 3.52 predicted a negative clinical disease course for up to 1 year (odds ratio, 6.97; 95% confidence interval, 2.87-16.93; P < .001), as did the presence of at least one disease complication at baseline (OR, 3.90; 95% CI, 1.21-12.53; P = .021). A worse clinical course at 1 year was also predicted by a baseline fecal calprotectin value of at least 250 mcg/g (OR, 5.43; 95% CI, 2.25-13.11; P < .001) and male sex (OR, 2.60; 95% CI, 1.12-6.02; P = .025).
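For readers who want to sanity-check reported figures like these (a back-calculation of ours, not anything the authors present), an odds ratio and its 95% CI jointly encode the underlying regression coefficient β and standard error, since OR = exp(β) and the CI bounds are exp(β ± 1.96 × SE):

import math

or_, lo, hi = 6.97, 2.87, 16.93                  # ultrasound score >3.52, as reported
beta = math.log(or_)                             # ~1.94
se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # ~0.45
# Recomputing the bounds recovers the published interval:
print(round(math.exp(beta - 1.96 * se), 2), round(math.exp(beta + 1.96 * se), 2))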
The investigators then assessed the relationship between baseline findings and individual disease outcomes at 12 months. High ultrasound score and high fecal calprotectin at baseline each predicted the need for treatment escalation, whereas disease behavior (inflammatory, stricturing, penetrating) and C-reactive protein (CRP) predicted the need for corticosteroids. The only significant baseline predictor of hospitalization at 1 year was CRP.
“[B]owel ultrasound is able to predict disease course in Crohn’s disease patients,” they wrote. “It may identify patients at high risk of a negative course to adopt effective strategies to prevent any disease progression. Our data need to be confirmed and validated in further large studies.”
The investigators disclosed relationships with Janssen, AbbVie, Mundipharma, and others.
Patients with Crohn’s disease (CD) undergo multiple colonoscopies during their lifetime. Endoscopic assessment is often necessary to determine the extent and severity of inflammation to guide the choice of therapy, to assess mucosal healing on current therapy, and for surveillance for colorectal dysplasia. Multiple colonoscopies over a lifetime present a significant financial burden for patients, and the invasive nature of the procedure, the small but real risk of perforation, and patient discomfort make for an undesirable experience. Cross-sectional imaging offers a noninvasive means of assessing the bowel wall and extraluminal complications of CD. Bowel ultrasound, performed as point-of-care imaging by gastroenterologists, is an emerging alternative for visualizing the bowel.
In the study by Allocca et al., the authors developed a bowel ultrasound–based score incorporating bowel wall thickness, pattern, flow, and presence of extraluminal complications. The score was developed by comparing ultrasound parameters with colonoscopy findings for each segment of the colon and terminal ileum. In a cohort of 225 patients, a bowel ultrasound score of >3.52 along with at least one extraluminal complication, baseline fecal calprotectin of >250 mcg/g, and male gender were linked with adverse outcomes within 12 months (defined as need for steroids, change of therapy, hospitalization, or surgery).
While these observations need to be validated externally, this study further consolidates the role of bowel ultrasound as a viable imaging modality for monitoring disease and response to therapy in CD. Prior studies have shown bowel ultrasound to be a valid alternative to MR enterography – without the expense, limited availability, and need for gadolinium contrast. As the therapeutic targets in IBD move toward mucosal healing, bowel ultrasound offers the promise of a cost-effective, noninvasive, point-of-care test that can be performed during an office consultation. The operator-dependent nature of this modality may limit its uptake and utilization. The International Bowel Ultrasound Group (IBUS) has collaborated with the European Crohn’s and Colitis Organisation and the Canadian Association of Gastroenterology to establish training and research in bowel ultrasound. Soon, patients can expect a bowel ultrasound to become part of their routine assessment during an office consultation.
Manreet Kaur, MD, is medical director of the Inflammatory Bowel Disease Center and an associate professor in the division of gastroenterology and hepatology at Baylor College of Medicine, Houston. She has no relevant conflicts of interest.
Patients with Crohn’s disease (CD) undergo multiple colonoscopies during their lifetime. Endoscopic assessment is often necessary to determine extent and severity of inflammation to guide choice of therapy, assess mucosal healing on current therapy, and for surveillance examination for colorectal dysplasia. Multiple colonoscopies over a lifetime present a significant financial burden for patients. The invasive nature of the procedure, along with the small but potential risk of perforation and patient discomfort make for an undesirable experience. Cross-sectional imaging offers the advantage of noninvasive modality to assess bowel wall and extraluminal complications related to CD. Bowel ultrasound, performed as point of care imaging by gastroenterologists, is an emerging imaging alternative to visualize the bowel.
In the study by Allocca et al., the authors developed a bowel ultrasound–based score incorporating bowel wall thickness, pattern, flow, and presence of extraluminal complications. The score was developed by comparing ultrasound parameters with colonoscopy findings for each segment of the colon and terminal ileum. In a cohort of 225 patients, a bowel ultrasound score of >3.52 along with at least one extraluminal complication, baseline fecal calprotectin of >250 mcg/g, and male gender were linked with adverse outcomes within 12 months (defined as need for steroids, change of therapy, hospitalization, or surgery).
While these observations need to be validated externally, this study further consolidates the role for bowel ultrasound as a viable imaging modality to monitor disease and response to therapy in CD. Prior studies have shown bowel ultrasound is a valid alternative to MR enterography – without the expense, limited availability, and need for gadolinium contrast. As the therapeutic targets in IBD move toward mucosa healing, bowel ultrasound offers the promise of a cost-effective, noninvasive, point-of care test that can be performed during an office consultation. The operator dependent nature of this modality may limit its uptake and utilization. The International Bowel Ultrasound Group (IBUS) has collaborated with the European Crohn’s and Colitis organization as well as the Canadian Association of Gastroenterology to establish training and research in bowel ultrasound. Soon, patients can expect a bowel ultrasound to become part of their routine assessment during an office consultation.
Manreet Kaur, MD, is medical director of the Inflammatory Bowel Disease Center and an associate professor in the division of gastroenterology and hepatology at Baylor College of Medicine, Houston. She has no relevant conflicts of interest.
Patients with Crohn’s disease (CD) undergo multiple colonoscopies during their lifetime. Endoscopic assessment is often necessary to determine extent and severity of inflammation to guide choice of therapy, assess mucosal healing on current therapy, and for surveillance examination for colorectal dysplasia. Multiple colonoscopies over a lifetime present a significant financial burden for patients. The invasive nature of the procedure, along with the small but potential risk of perforation and patient discomfort make for an undesirable experience. Cross-sectional imaging offers the advantage of noninvasive modality to assess bowel wall and extraluminal complications related to CD. Bowel ultrasound, performed as point of care imaging by gastroenterologists, is an emerging imaging alternative to visualize the bowel.
In the study by Allocca et al., the authors developed a bowel ultrasound–based score incorporating bowel wall thickness, pattern, flow, and presence of extraluminal complications. The score was developed by comparing ultrasound parameters with colonoscopy findings for each segment of the colon and terminal ileum. In a cohort of 225 patients, a bowel ultrasound score of >3.52 along with at least one extraluminal complication, baseline fecal calprotectin of >250 mcg/g, and male gender were linked with adverse outcomes within 12 months (defined as need for steroids, change of therapy, hospitalization, or surgery).
While these observations need to be validated externally, this study further consolidates the role for bowel ultrasound as a viable imaging modality to monitor disease and response to therapy in CD. Prior studies have shown bowel ultrasound is a valid alternative to MR enterography – without the expense, limited availability, and need for gadolinium contrast. As the therapeutic targets in IBD move toward mucosa healing, bowel ultrasound offers the promise of a cost-effective, noninvasive, point-of care test that can be performed during an office consultation. The operator dependent nature of this modality may limit its uptake and utilization. The International Bowel Ultrasound Group (IBUS) has collaborated with the European Crohn’s and Colitis organization as well as the Canadian Association of Gastroenterology to establish training and research in bowel ultrasound. Soon, patients can expect a bowel ultrasound to become part of their routine assessment during an office consultation.
Manreet Kaur, MD, is medical director of the Inflammatory Bowel Disease Center and an associate professor in the division of gastroenterology and hepatology at Baylor College of Medicine, Houston. She has no relevant conflicts of interest.
Bowel ultrasound predicts the clinical course of Crohn’s disease for up to 1 year, according to results of a prospective trial involving 225 patients.
After additional confirmation in larger studies, ultrasound could serve as a noninvasive alternative to colonoscopy for monitoring and predicting disease course, reported lead author Mariangela Allocca, MD, PhD, of Humanitas University, Milan, and colleagues.
“Frequent colonoscopies are expensive, invasive, and not well tolerated by patients, thus noninvasive tools for assessment and monitoring are strongly needed,” the investigators wrote in Clinical Gastroenterology and Hepatology. “Bowel ultrasound accurately detects inflammatory bowel disease activity, extent, and complications, particularly in Crohn’s disease. Considering its low cost, minimal invasiveness ... and easy repeatability, bowel ultrasound may be a simple, readily available tool for assessing and monitoring Crohn’s disease.”
To test this hypothesis, Dr. Allocca and colleagues enrolled 225 consecutive patients with ileal and/or colonic Crohn’s disease, diagnosed at least 6 months previously and managed at a tertiary hospital in Italy. All patients underwent both colonoscopy and bowel ultrasound, with no more than 3 months between the two procedures.
Colonoscopy results were characterized by the Simplified Endoscopic Score for Crohn’s disease (SES-CD), whereas ultrasound was scored using several parameters, including bowel wall pattern, bowel wall thickness, bowel wall flow, presence of complications (abscess, fistula, stricture), and characteristics of mesenteric lymph nodes and tissue. Ultrasound scores were considered high if they exceeded a cutoff of 3.52, determined by receiver operating characteristic (ROC) curve analysis.
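The study does not publish its analysis code, but the short sketch below, run on synthetic data, shows one common way such a cutoff can be read off an ROC curve – by maximizing Youden’s J statistic (sensitivity + specificity − 1). The scores and labels are simulated, so the resulting threshold only resembles, and does not reproduce, the study’s 3.52.

```python
# Synthetic illustration of deriving a cutoff from an ROC curve.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
# Simulated ultrasound scores: higher in endoscopically active disease
scores = np.concatenate([rng.normal(2.5, 1.0, 100),   # inactive (label 0)
                         rng.normal(4.5, 1.2, 100)])  # active (label 1)
labels = np.concatenate([np.zeros(100), np.ones(100)])

fpr, tpr, thresholds = roc_curve(labels, scores)
best = np.argmax(tpr - fpr)  # Youden's J = sensitivity + specificity - 1
print(f"optimal cutoff: {thresholds[best]:.2f}")
```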
Participants were followed for 12 months after the baseline ultrasound. The primary objective was to determine the relationship between baseline ultrasound findings and a negative disease course, defined by steroid use, need for surgery, need for hospitalization, and/or change in therapy. The secondary objective was to characterize the relationship between ultrasound findings and endoscopic activity.
Multivariable analysis revealed that ultrasound scores greater than 3.52 predicted a negative clinical disease course for up to 1 year (odds ratio, 6.97; 95% confidence interval, 2.87-16.93; P < .001), as did the presence of at least one disease complication at baseline (OR, 3.90; 95% CI, 1.21-12.53; P = .021). A worse clinical course at 1 year was also predicted by a baseline fecal calprotectin value of at least 250 mcg/g (OR, 5.43; 95% CI, 2.25-13.11; P < .001) and male sex (OR, 2.60; 95% CI, 1.12-6.02; P = .025).
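For readers unfamiliar with how such figures are produced, the sketch below fits a multivariable logistic regression to synthetic data and exponentiates the coefficients to obtain odds ratios with 95% confidence intervals. The variable names and effect sizes are invented for the example and do not reproduce the study’s model.

```python
# Synthetic illustration of a multivariable logistic model yielding ORs.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 225
df = pd.DataFrame({
    "high_us_score": rng.integers(0, 2, n),      # ultrasound score > 3.52
    "complication": rng.integers(0, 2, n),       # >= 1 baseline complication
    "high_calprotectin": rng.integers(0, 2, n),  # >= 250 mcg/g
    "male": rng.integers(0, 2, n),
})
# Simulate outcomes from invented log-odds effects
logit = (-1.5 + 1.9 * df.high_us_score + 1.4 * df.complication
         + 1.7 * df.high_calprotectin + 0.9 * df.male)
df["negative_course"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["high_us_score", "complication",
                        "high_calprotectin", "male"]])
fit = sm.Logit(df["negative_course"], X).fit(disp=0)
print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```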
Investigators then assessed the relationships between baseline results and individual disease outcomes at 12 months. For example, a high ultrasound score and an elevated calprotectin at baseline each predicted the need for treatment escalation, whereas disease behavior (inflammatory, stricturing, or penetrating) and C-reactive protein (CRP) predicted the need for corticosteroids. The only significant predictor of hospitalization a year later was CRP.
“[B]owel ultrasound is able to predict disease course in Crohn’s disease patients,” they wrote. “It may identify patients at high risk of a negative course to adopt effective strategies to prevent any disease progression. Our data need to be confirmed and validated in further large studies.”
The investigators disclosed relationships with Janssen, AbbVie, Mundipharma, and others.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY