AGA issues position statements on reducing CRC burden
The American Gastroenterological Association has published eight position statements aimed at reducing the burden of colorectal cancer (CRC).
The evidence-based statements, published in Gastroenterology, call for a national approach to CRC screening, outline the elements of a high-quality screening program, and make clear that payers should cover all costs, from bowel prep through pathology, plus follow-up for high-risk patients.
“There is strong evidence that CRC screening is effective [at reducing CRC incidence and mortality] ... but less than 70% of eligible individuals have been screened,” wrote authors led by David Lieberman, MD, who is on the AGA Executive Committee on the Screening Continuum and affiliated with Oregon Health and Science University, Portland, noting the recent expansion of eligibility to include individuals in the 45- to 49-year age group.
“CRC screening saves lives, but only if people get screened,” Dr. Lieberman said in a press release from the AGA. “Cost sharing is an important barrier to screening, which contributes to racial, ethnic and socioeconomic inequities in colorectal cancer outcomes. The full cost of screening – including noninvasive tests and follow-up colonoscopies – should be covered without cost to patients.”
He added: “AGA wishes to collaborate with stakeholders to eliminate obstacles to screening, which disproportionately impact those with low income and lack of insurance.”
Eliminating disparities in screening
Among the position statements, Dr. Lieberman and colleagues first called for “development of a national approach to CRC screening” to patch gaps in access across the United States.
“Systematic outreach occurs infrequently,” they noted. “CRC screening prevalence is much lower among individuals who do not have access to health care due to lack of insurance, do not have a primary care provider, or are part of a medically underserved community.”
According to Dr. Lieberman and colleagues, the AGA is also “working with a broad coalition of stakeholders,” such as the American Cancer Society, payers, patient advocacy groups, and others, to create a “national resource ... focused on ensuring high-quality CRC screening and eliminating barriers to CRC screening.”
Specifically, the coalition will work to collectively tackle “disparities created by social determinants of health, which includes lack of access to screening, transportation, and even work hours and child care.
“The AGA recognizes that moving the needle to achieve a CRC screening participation goal of 80% will take a village,” they wrote.
Elements of high-quality CRC screening
The investigators went on to describe the key features of a high-quality CRC screening program, including “colonoscopy and noninvasive screening options, patient education, outreach, and navigation support.”
Dr. Lieberman and colleagues pointed out that offering more than one type of screening test “acknowledges patient preferences and improves participation.”
Certain noninvasive methods, such as fecal immunochemical testing (FIT), eliminate “important barriers” to screening, they noted, such as the need for special preparation, time off work, and transportation to a medical facility.
For individuals who have high-risk adenomas (HRAs) or advanced sessile serrated lesions (SSLs), screening should be expanded to include follow-up, the investigators added.
“Evidence from a systematic review demonstrates that individuals with HRAs at baseline have a 3- to 4-fold higher risk of incident CRC during follow-up compared with individuals with no adenoma or low-risk adenomas,” they wrote. “There is also evidence that individuals with advanced SSLs have a 3- to 4-fold higher risk of CRC, compared with individuals with nonadvanced SSLs.”
Payers should cover costs
To further improve access to care, payers should cover the full costs of CRC screening because “copays and deductibles are barriers to screening and contribute to socioeconomic disparities” that “disproportionately impact those with low income and lack of insurance,” according to Dr. Lieberman and colleagues.
They noted that the Affordable Care Act “eliminated copayments for preventive services,” yet a recent study showed that almost half of patients with commercial insurance and more than three-quarters of patients with Medicare still share some cost of CRC screening.
The investigators made clear that payers need to cover costs from start to finish, including “bowel preparation, facility and professional fees, anesthesia, and pathology,” as well as follow-up screening for high-risk patients identified by noninvasive methods.
“Noninvasive colorectal screening should be considered as programs with multiple steps, each of which, including follow-up colonoscopy if the test is positive, should be covered by payers without cost sharing as part of the screening continuum,” Dr. Lieberman and colleagues wrote.
Changes underway
According to Steven Itzkowitz, MD, professor of medicine and oncological sciences and director of the gastroenterology fellowship training program at the Icahn School of Medicine at Mount Sinai, New York, the AGA publication is important because it “consolidates many of the critical issues related to decreasing the burden of colorectal cancer in the United States.”
Dr. Itzkowitz noted that changes are already underway to eliminate cost as a barrier to screening.
“The good news is that, in the past year, the Departments of Health & Human Services, Labor, and Treasury declared that cost sharing should not be imposed, and plans are required to cover screening colonoscopy with polyp removal and colonoscopy that is performed to follow-up after an abnormal noninvasive CRC screening test,” Dr. Itzkowitz said in an interview. “Many plans are following suit, but it will take time for this coverage to take effect across all plans.”
For individual gastroenterologists who would like to do their part in reducing screening inequity, Dr. Itzkowitz suggested leveraging noninvasive testing, as the AGA recommends.
“This publication is the latest to call for using noninvasive, stool-based testing in addition to colonoscopy,” Dr. Itzkowitz said. “FIT and multitarget stool DNA tests all have proven efficacy in this regard, so gastroenterologists should have those conversations with their patients. GIs can also make it easier for patients to complete colonoscopy by developing patient navigation programs, direct access referrals, and systems for communicating with primary care providers for easier referrals and communicating colonoscopy results.”
Many practices are already instituting such improvements in response to the restrictions imposed by the COVID-19 pandemic, according to Dr. Itzkowitz. “These changes, plus better coverage by payers, will make a huge impact on health equity when it comes to colorectal cancer screening.”
The publication was supported by the AGA. The investigators disclosed relationships with Geneoscopy, ColoWrap, UniversalDx, and others. Dr. Itzkowitz disclosed no relevant conflicts of interest.
Groups interested in collaborating with AGA should contact Kathleen Teixeira, AGA Vice President, Public Policy and Advocacy, at [email protected].
FROM GASTROENTEROLOGY
AI-based CADe outperforms high-definition white light in colonoscopy
An artificial intelligence (AI)–based computer-aided polyp detection (CADe) system missed fewer adenomas, polyps, and sessile serrated lesions and identified more adenomas per colonoscopy than a high-definition white light (HDWL) colonoscopy, according to findings from a randomized study.
While adenoma detection by colonoscopy is associated with a reduced risk of interval colon cancer, detection rates of adenomas vary among physicians. AI approaches, such as machine learning and deep learning, may improve adenoma detection rates during colonoscopy and thus potentially improve outcomes for patients, suggested study authors led by Jeremy R. Glissen Brown, MD, of the Beth Israel Deaconess Medical Center and Harvard Medical School, Boston, who reported their trial findings in Clinical Gastroenterology and Hepatology.
The investigators explained that, although AI approaches may offer benefits in adenoma detection, there have been no prospective data for U.S. populations on the efficacy of an AI-based CADe system for improving adenoma detection rates (ADRs) and reducing adenoma miss rates (AMRs). To overcome this research gap, the investigators performed a prospective, multicenter, single-blind randomized tandem colonoscopy study which assessed a deep learning–based CADe system in 232 patients.
Individuals who presented to the four included U.S. medical centers for either colorectal cancer screening or surveillance were randomly assigned to the CADe system colonoscopy first (n = 116) or HDWL colonoscopy first (n = 116). This was immediately followed by the other procedure, in tandem fashion, performed by the same endoscopist. AMR was the primary outcome of interest, while secondary outcomes were adenomas per colonoscopy (APC) and the miss rate of sessile serrated lesions (SSL).
Nine patients were excluded, leaving 223 in the analysis; 45.3% of the cohort was female, 67.7% were White, and 21% were Black. Most patients (60%) underwent colonoscopy for primary colorectal cancer screening.
The AMR was significantly lower in the CADe-first group than in the HDWL-first group (20.12% vs. 31.25%; P = .0247). The researchers commented that, although the CADe system resulted in a statistically significantly lower AMR, adenomas were still being missed.
Additionally, the CADe-first group had a lower SSL miss rate, compared with the HDWL-first group (7.14% vs. 42.11%, respectively; P = .0482). The researchers noted that their study is one of the first research studies to show that a computer-assisted polyp detection system can reduce the SSL miss rate. The first-pass APC was also significantly higher in the CADe-first group (1.19 vs. 0.90; P = .0323). No statistically significant difference was observed between the groups in regard to the first-pass ADR (50.44% for the CADe-first group vs. 43.64% for the HDWL-first group; P = .3091).
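In a tandem design, the per-lesion miss rate for the first-pass modality is typically computed as the number of lesions found only on the second pass divided by all lesions found across both passes. A minimal sketch of that arithmetic (the counts below are hypothetical, not taken from this study):

```python
def miss_rate(found_first_pass: int, found_second_pass: int) -> float:
    """Per-lesion miss rate in a tandem study: lesions detected only on
    the second pass, as a fraction of all lesions detected in either pass."""
    total = found_first_pass + found_second_pass
    if total == 0:
        return 0.0
    return found_second_pass / total

# Hypothetical example: 130 adenomas found on the first pass,
# 59 additional adenomas found on the second pass.
print(round(miss_rate(130, 59) * 100, 2))  # first-pass miss rate, in percent
```

Note that this is a per-lesion rate pooled over the whole group, which is why it can differ from averaging each patient's individual miss rate.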
A multivariate logistic regression analysis identified three significant factors predictive of missed polyps: use of HDWL first vs. the computer-assisted detection system first (odds ratio, 1.8830; P = .0214), age 65 years or younger (OR, 1.7390; P = .0451), and right colon vs. other location (OR, 1.7865; P = .0436).
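For readers unfamiliar with the effect measure reported above, an odds ratio compares the odds of an event (here, a missed polyp) between two groups; the logistic regression estimates it while adjusting for the other factors. A minimal unadjusted sketch from a 2x2 table (all counts are hypothetical, not from this study):

```python
def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Unadjusted odds ratio from a 2x2 table:
    a = events in the exposed group,   b = non-events in the exposed group,
    c = events in the unexposed group, d = non-events in the unexposed group.
    OR = (a/b) / (c/d) = (a*d) / (b*c)."""
    return (a * d) / (b * c)

# Hypothetical: 40 of 100 polyps missed when HDWL was used first,
# 25 of 100 missed when CADe was used first.
print(odds_ratio(40, 60, 25, 75))  # 2.0 -> odds of a miss twice as high with HDWL first
```

An OR above 1 means the factor is associated with higher odds of a miss; the regression's adjusted ORs (e.g., 1.88 for HDWL first) carry the same interpretation.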
According to the researchers, the study was not powered to identify differences in ADR, thereby limiting the interpretation of this analysis. In addition, the investigators noted that the tandem colonoscopy study design is limited in its generalizability to real-world clinical settings. Also, given that endoscopists were not blinded to group assignments while performing each withdrawal, the researchers commented that “it is possible that endoscopist performance was influenced by being observed or that endoscopists who participated for the length of the study became over-reliant on” the CADe system during withdrawal, resulting in an underestimation or overestimation of the system’s performance.
The authors concluded that their findings suggest that an AI-based CADe system with colonoscopy “has the potential to decrease interprovider variability in colonoscopy quality by reducing AMR, even in experienced providers.”
This was an investigator-initiated study, with research software and study funding provided by Wision AI. The investigators reported relationships with Wision AI, as well as Olympus, Fujifilm, and Medtronic.
Several randomized trials testing artificial intelligence (AI)–assisted colonoscopy showed improvement in adenoma detection. This study adds to the growing body of evidence that computer-aided detection (CADe) systems for adenoma augment adenoma detection rates, even among highly skilled endoscopists whose baseline ADRs are much higher than the currently recommended threshold for quality colonoscopy (25%).
This study also highlights the usefulness of CADe in aiding detection of sessile serrated lesions (SSLs). SSLs appear to be challenging for trainees to recognize and are the most likely type of large polyp to be missed overall.
AI-based systems will enhance but not replace the highly skilled operator. As this study pointed out, despite the superior ADR, adenomas were still missed with CADe, mainly because the missed polyps were never brought into the visual field by the operator. A combination of a CADe program and a distal-attachment mucosal exposure device in the hands of an experienced endoscopist might yield the best results.
Monika Fischer, MD, is an associate professor of medicine at Indiana University, Indianapolis. She reported no relevant conflicts of interest.
An artificial intelligence (AI)–based computer-aided polyp detection (CADe) system missed fewer adenomas, polyps, and sessile serrated lesions and identified more adenomas per colonoscopy than a high-definition white light (HDWL) colonoscopy, according to findings from a randomized study.
While adenoma detection by colonoscopy is associated with a reduced risk of interval colon cancer, detection rates of adenomas vary among physicians. AI approaches, such as machine learning and deep learning, may improve adenoma detection rates during colonoscopy and thus potentially improve outcomes for patients, suggested study authors led by Jeremy R. Glissen Brown, MD, of the Beth Israel Deaconess Medical Center and Harvard Medical School, Boston, who reported their trial findings in Clinical Gastroenterology and Hepatology.
The investigators explained that, although AI approaches may offer benefits in adenoma detection, there have been no prospective data for U.S. populations on the efficacy of an AI-based CADe system for improving adenoma detection rates (ADRs) and reducing adenoma miss rates (AMRs). To overcome this research gap, the investigators performed a prospective, multicenter, single-blind randomized tandem colonoscopy study which assessed a deep learning–based CADe system in 232 patients.
Individuals who presented to the four included U.S. medical centers for either colorectal cancer screening or surveillance were randomly assigned to the CADe system colonoscopy first (n = 116) or HDWL colonoscopy first (n = 116). This was immediately followed by the other procedure, in tandem fashion, performed by the same endoscopist. AMR was the primary outcome of interest, while secondary outcomes were adenomas per colonoscopy (APC) and the miss rate of sessile serrated lesions (SSL).
Nine patients were excluded, leaving a total population of 223 patients. Of this cohort, 45.3% were female, 67.7% were White, and 21% were Black. Most patients (60%) underwent colonoscopy for primary colorectal cancer screening.
The AMR was significantly lower in the CADe-first group than in the HDWL-first group (20.12% vs. 31.25%, respectively; P = .0247). The researchers commented that, although the CADe system produced a statistically significant reduction, roughly one in five adenomas was still missed.
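In a tandem design, a miss rate is conventionally computed as the number of lesions detected only on the second pass divided by all lesions detected across both passes. A minimal sketch of that arithmetic, using illustrative counts rather than the study's raw data:

```python
def miss_rate(first_pass_count: int, second_pass_count: int) -> float:
    """Tandem-design miss rate: lesions found only on the second pass,
    as a fraction of all lesions found across both passes."""
    return second_pass_count / (first_pass_count + second_pass_count)

# Illustrative counts (the paper reports rates, not these raw numbers):
# 135 adenomas on the first pass plus 34 more found on the second pass
# gives 34 / 169, about 20.1% -- close to the CADe-first group's AMR.
print(round(miss_rate(135, 34) * 100, 1))  # → 20.1
```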
Additionally, the CADe-first group had a lower SSL miss rate, compared with the HDWL-first group (7.14% vs. 42.11%, respectively; P = .0482). The researchers noted that their study is one of the first research studies to show that a computer-assisted polyp detection system can reduce the SSL miss rate. The first-pass APC was also significantly higher in the CADe-first group (1.19 vs. 0.90; P = .0323). No statistically significant difference was observed between the groups in regard to the first-pass ADR (50.44% for the CADe-first group vs. 43.64 % for the HDWL-first group; P = .3091).
A multivariate logistic regression analysis identified three significant factors predictive of missed polyps: use of HDWL first vs. the computer-assisted detection system first (odds ratio, 1.8830; P = .0214), age 65 years or younger (OR, 1.7390; P = .0451), and right colon vs. other location (OR, 1.7865; P = .0436).
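As a rough illustration of how such a model is read, the reported odds ratios can be converted to log-odds coefficients and combined into a predicted probability. Only the three odds ratios come from the study; the intercept and patient profiles below are invented for illustration.

```python
import math

# Odds ratios for a missed polyp, as reported in the multivariate model
odds_ratios = {
    "hdwl_first": 1.8830,
    "age_65_or_younger": 1.7390,
    "right_colon": 1.7865,
}

# A logistic model's coefficients are the natural logs of the odds ratios
coefs = {name: math.log(or_) for name, or_ in odds_ratios.items()}

def miss_probability(intercept: float, **flags: int) -> float:
    """P(missed polyp) = 1 / (1 + exp(-(intercept + sum(coef * flag)))).
    The intercept is hypothetical -- the paper does not report one."""
    z = intercept + sum(coefs[name] * flag for name, flag in flags.items())
    return 1.0 / (1.0 + math.exp(-z))

# With a made-up intercept, a patient with all three risk factors has a
# higher predicted miss probability than a patient with none of them:
p_high = miss_probability(-1.5, hdwl_first=1, age_65_or_younger=1, right_colon=1)
p_low = miss_probability(-1.5, hdwl_first=0, age_65_or_younger=0, right_colon=0)
```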
According to the researchers, the study was not powered to identify differences in ADR, thereby limiting the interpretation of this analysis. In addition, the investigators noted that the tandem colonoscopy study design is limited in its generalizability to real-world clinical settings. Also, given that endoscopists were not blinded to group assignments while performing each withdrawal, the researchers commented that “it is possible that endoscopist performance was influenced by being observed or that endoscopists who participated for the length of the study became over-reliant on” the CADe system during withdrawal, resulting in an underestimate or overestimation of the system’s performance.
The authors concluded that their findings suggest that an AI-based CADe system with colonoscopy “has the potential to decrease interprovider variability in colonoscopy quality by reducing AMR, even in experienced providers.”
This was an investigator-initiated study, with research software and study funding provided by Wision AI. The investigators reported relationships with Wision AI, as well as Olympus, Fujifilm, and Medtronic.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Esophageal cancer screening isn’t for everyone: Study
Endoscopic screening for esophageal adenocarcinoma (EAC) may not be a cost-effective strategy for all populations, and may even lead to net harm in some, according to a comparative cost-effectiveness analysis.
Several U.S. guidelines suggest the use of endoscopic screening for EAC, yet recommendations within these guidelines vary in terms of which populations should receive screening, according to study authors led by Joel H. Rubenstein, MD, of the Lieutenant Charles S. Kettles Veterans Affairs Medical Center, Ann Arbor, Mich. Their findings were published in Gastroenterology. In addition, no randomized trials to date have evaluated endoscopic screening outcomes among different populations. Population screening recommendations in the current guidelines have been informed mostly by observational data and expert opinion.
Existing cost-effectiveness analyses of EAC screening have mostly focused on screening older men with gastroesophageal reflux disease (GERD) at certain ages, and many of these analyses have limited data regarding diverse patient populations.
In their study, Dr. Rubenstein and colleagues performed a comparative cost-effectiveness analysis of endoscopic screening for EAC that was restricted to individuals with GERD symptoms in the general population. The analysis was stratified by race and sex. The primary objective of the analysis was to identify and establish the optimal age at which to offer endoscopic screening in the specific populations evaluated in the study.
The investigators conducted their comparative cost-effectiveness analyses using three independent simulation models. The independently developed models – which focused on EAC natural history, screening, surveillance, and treatment – are part of the National Cancer Institute’s Cancer Intervention and Surveillance Modeling Network. Each model included four cohorts, defined by race (White or Black) and sex, which were independently calibrated to targets to reproduce the EAC incidence in the United States. The three models were based on somewhat different structures and assumptions; for example, two of the models assumed stable prevalence of GERD symptoms of approximately 20% across ages, while the third assumed a near-linear increase across adulthood. All three assumed EAC develops only in individuals with Barrett’s esophagus.
In each base case, the researchers simulated cohorts of people in the United States who were born in 1950, and then stratified these individuals by race and sex and followed each individual from 40 years of age until 100 years of age. The researchers considered 42 strategies, such as no screening, a single endoscopic screening at six specified ages (between 40 and 65 years of age), and a single screening in individuals with GERD symptoms at the six specified ages.
Primary results were the averaged results across all three models. The optimal screening strategy, defined by the investigators, was the strategy with the highest effectiveness that had an incremental cost-effectiveness ratio of less than $100,000 per quality-adjusted life-year gained.
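The selection rule can be sketched as a small calculation: walk through strategies from least to most effective, accepting each one whose incremental cost-effectiveness ratio (ICER) against the last accepted strategy stays under the $100,000-per-QALY threshold. All strategy names, costs, and QALY values below are invented for illustration; only the threshold comes from the study.

```python
def icer(strategy: dict, comparator: dict) -> float:
    """Incremental cost-effectiveness ratio: extra dollars per extra QALY."""
    return (strategy["cost"] - comparator["cost"]) / (
        strategy["qaly"] - comparator["qaly"]
    )

# Hypothetical strategies, ordered from least to most effective
# (per-person costs and QALYs gained vs. no screening; numbers invented)
no_screening = {"name": "no screening", "cost": 0.0, "qaly": 0.0}
strategies = [
    {"name": "GERD once at 55", "cost": 900.0, "qaly": 0.012},
    {"name": "GERD at 45 and 60", "cost": 1700.0, "qaly": 0.021},
    {"name": "screen all at 50", "cost": 6500.0, "qaly": 0.024},
]

# Accept each more-effective strategy only if its ICER against the
# last accepted strategy is under $100,000 per QALY gained.
THRESHOLD = 100_000
optimal = no_screening
for s in strategies:
    if icer(s, optimal) < THRESHOLD:
        optimal = s

print(optimal["name"])  # → GERD at 45 and 60
```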
The most effective – yet the most costly – screening strategies for White men were those that screened all of them once between 40 and 55 years of age. The optimal screening strategy, however, was one that screened individuals with GERD twice, once at age 45 years and again at 60 years. The researchers determined that screening Black men with GERD once at 55 years of age was optimal.
By contrast, the optimal strategy for women, whether White or Black, was no screening at all. “In particular, among Black women, screening is, at best, very expensive with little benefit, and some strategies cause net harm,” the authors wrote.
The investigators wrote that there is a need for empiric, long-term studies “to confirm whether repeated screening has a substantial yield of incident” Barrett’s esophagus. The researchers also noted that their study was limited by the lack of inclusion of additional risk factors, such as smoking, obesity, and family history, which may have led to different conclusions on specific screening strategies.
“We certainly acknowledge the history of health care inequities, and that race is a social construct that, in the vast majority of medical contexts, has no biological basis. We are circumspect regarding making recommendations based on race or sex if environmental exposures or genetic factors on which to make those recommendations were available,” they wrote.
The study was supported by National Institutes of Health/National Cancer Institute grants. Some authors disclosed relationships with Lucid Diagnostics, Value Analytics Labs, and Cernostics.
Over the past decades we have seen an alarming rise in the incidence of esophageal adenocarcinoma, mostly diagnosed at an advanced stage when curative treatment is no longer an option. Esophageal adenocarcinoma develops from Barrett’s esophagus that, if known to be present, can be surveilled to detect dysplasia and cancer at an early and curable stage.
Whereas screening for Barrett’s esophagus has to date focused on White males with gastroesophageal reflux, little is known about screening in non-White and non-male populations. Identifying whom and how to screen poses a challenge, and in real life such studies of varied populations would require many patients, years of follow-up, much effort, and substantial cost. Rubenstein and colleagues used three independent simulation models to simulate many different screening scenarios while taking gender and race into account. The outcomes of this study, which demonstrate that one size does not fit all, will be very relevant in guiding future strategies for screening for Barrett’s esophagus and early esophageal adenocarcinoma. Although the study is built around endoscopic screening, its insights will also be relevant when considering nonendoscopic screening tools.
R.E. Pouw, MD, PhD, is with Amsterdam University Medical Centers. She disclosed having been a consultant for MicroTech and Medtronic and having received speaker fees from Pentax.
FROM GASTROENTEROLOGY
Confronting endoscopic infection control
The reprocessing of endoscopes following gastrointestinal endoscopy is highly effective for mitigating the risk of exogenous infection, yet challenges in duodenoscope reprocessing persist. While several enhanced reprocessing measures have been developed to reduce duodenoscope-related infection risks, their effectiveness is largely unclear.
Rahul A. Shimpi, MD, and Joshua P. Spaete, MD, from Duke University, Durham, N.C., wrote in a paper in Techniques and Innovations in Gastrointestinal Endoscopy that novel disposable duodenoscope technologies offer promise for reducing infection risk and overcoming current reprocessing challenges. The paper notes that, despite this promise, there is a need to better define the usability, costs, and environmental impact of these disposable technologies.
Current challenges in endoscope reprocessing
According to the authors, reprocessing of gastrointestinal endoscopes involves several sequential steps that require “meticulous” attention to detail “to ensure the adequacy of reprocessing.” Human error is a major contributor to suboptimal reprocessing quality, often stemming from variable adherence to reprocessing protocols among centers and reprocessing staff members.
Despite these challenges, infectious complications associated with gastrointestinal endoscopy are rare, particularly with end-viewing endoscopes. Several high-profile infectious outbreaks associated with duodenoscopes have been reported in recent years, however, heightening awareness of, and concern about, endoscope reprocessing. Many of these outbreaks, the authors said, have involved multidrug-resistant organisms.
The complex elevator mechanism, which the authors noted “is relatively inaccessible during the precleaning and manual cleaning steps in reprocessing,” represents a paramount challenge in the reprocessing of duodenoscopes. The challenge related to this mechanism potentially contributes to greater biofilm formation and contamination. Other factors implicated in the transmission of duodenoscope-associated infections from patient to patient include other design issues, human errors in reprocessing, endoscope damage and channel defects, and storage and environmental factors.
“Given the reprocessing challenges posed by duodenoscopes, in 2015 the Food and Drug Administration issued a recommendation that one or more supplemental measures be implemented by facilities as a means to decrease the infectious risk posed by duodenoscopes,” the authors noted, including ethylene oxide (EtO) sterilization, liquid chemical sterilization, and repeat high-level disinfection (HLD). They added, however, that a recent U.S. multisociety reprocessing guideline “does not recommend repeat high-level disinfection over single high-level disinfection, and recommends use of EtO sterilization only for duodenoscopes in infectious outbreak settings.”
New sterilization technologies
Liquid chemical sterilization (LCS) may be a promising alternative to EtO sterilization because it features a shorter disinfection cycle time and less endoscope wear and damage. However, clinical data on the effectiveness of LCS in endoscope reprocessing remain very limited.
The high costs and toxicities associated with EtO sterilization may be overcome by plasma-activated gas, another novel low-temperature sterilization technology. This newer technique also features a shorter reprocessing time, making it an attractive option for duodenoscope reprocessing. The authors noted that, although it showed promise in a proof-of-concept study, “plasma-activated gas has not been assessed in working endoscopes or compared directly to existing HLD and EtO sterilization technologies.”
Quality indicators in reprocessing
Recently, several quality indicators have been developed to assess the quality of endoscope reprocessing. The indicators, the authors noted, may theoretically allow “for point-of-care assessment of reprocessing quality.” To date, the data to support these indicators are limited.
Adenosine triphosphate (ATP) testing has been the most widely studied indicator because it can be used to examine the presence of biofilms during endoscope reprocessing via previously established ATP benchmark levels, the authors wrote. Studies assessing the efficacy of ATP testing, however, are limited by their use of heterogeneous assays, analytical techniques, and cutoffs for identifying contamination.
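As a sketch of how such a point-of-care check works, a reading in relative light units (RLU) is compared against a benchmark. The 200 RLU default below is one cutoff commonly cited in the reprocessing literature, not a value from this paper; as the paragraph above notes, published cutoffs vary.

```python
def cleaning_adequate(rlu_reading: float, benchmark_rlu: float = 200.0) -> bool:
    """Return True if a post-manual-cleaning ATP reading falls below the
    benchmark. The 200 RLU default is one commonly cited cutoff; published
    studies use heterogeneous assays and thresholds."""
    return rlu_reading < benchmark_rlu

print(cleaning_adequate(45.0))   # → True: proceed to high-level disinfection
print(cleaning_adequate(510.0))  # → False: repeat manual cleaning and retest
```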
Hemoglobin, protein, and carbohydrate are other point-of-care indicators that have previously demonstrated potential capability of assessing the achievement of adequate manual endoscope cleaning before high-level disinfection or sterilization.
Novel disposable duodenoscope technologies
Given that studies have consistently shown residual duodenoscope contamination after standard and enhanced reprocessing, increased attention has been placed on novel disposable duodenoscope technologies. In 2019, the FDA recommended a move toward duodenoscopes with disposable components because they could make reprocessing easier, more effective, or altogether unnecessary. According to the authors, six duodenoscopes with disposable components are currently cleared by the FDA for use: three that use a disposable endcap, one that uses a disposable elevator and endcap, and two that are fully disposable. The authors stated that, while “improved access to the elevator facilitated by a disposable endcap may allow for improved cleaning” and reduce contamination and biofilm formation, there are no data to confirm these proposed advantages.
There are several unanswered questions regarding new disposable duodenoscope technologies, including questions related to the usability, costs, and environmental impact of these technologies. The authors summarized several studies discussing these issues; however, a clear definition or consensus regarding how to approach these challenges has yet to be established. In addition to these unanswered questions, the authors also noted that identifying the acceptable rate of infectious risk associated with disposable duodenoscopes is another “important task” that needs to be accomplished in the near future.
Environmental impact
The authors stated that the U.S. health care system is directly responsible for up to 10% of total U.S. greenhouse gas emissions. Additionally, the heavy use of chemicals and water in endoscope reprocessing represents a “substantial” concern for the environment. One estimate suggested that performing a mean of 40 total endoscopies per day generates around 15.78 tons of CO2 per year.
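The cited figure implies a per-procedure footprint that can be back-calculated; the working-day count below is an assumption, since the source does not state one.

```python
# Back-of-the-envelope check of the cited figure: a unit performing
# 40 endoscopies per day emitting 15.78 metric tons of CO2 per year.
ENDOSCOPIES_PER_DAY = 40
TONS_CO2_PER_YEAR = 15.78
WORKING_DAYS_PER_YEAR = 250  # assumption; the source does not state this

procedures_per_year = ENDOSCOPIES_PER_DAY * WORKING_DAYS_PER_YEAR  # 10,000
kg_co2_per_procedure = TONS_CO2_PER_YEAR * 1000 / procedures_per_year
print(round(kg_co2_per_procedure, 2))  # → 1.58 kg CO2 per procedure
```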
Given the unclear environmental impact of disposable endoscopes, the authors suggested a clear need for interventions that reduce their potential negative impact. Proposed strategies include reducing the number of endoscopies performed, increasing recycling and the use of recyclable materials, and using renewable energy sources in endoscopy units.
“The massive environmental impact of gastrointestinal endoscopy as a whole has become increasingly recognized,” the authors wrote, “and further study and interventions directed at improving the environmental footprint of endoscopy will be of foremost importance.”
The authors disclosed no conflicts of interest.
The future remains to be seen
Solutions surrounding proper endoscope reprocessing and infection prevention have become a major focus of investigation and innovation in endoscope design, particularly related to duodenoscopes. As multiple infectious outbreaks associated with duodenoscopes have been reported, the complex mechanism of the duodenoscope elevator has emerged as the target for modification because it is somewhat inaccessible and difficult to adequately clean.
One of the major considerations with disposable duodenoscopes is cost. Currently, the savings from eliminating reprocessing equipment, supplies, and personnel do not offset the cost of the disposable duodenoscope itself. Studies on the environmental impact of disposable duodenoscopes suggest a major increase in endoscopy-related waste.
In summary, enhanced reprocessing techniques and modified scope design elements may not achieve adequate thresholds for infection prevention. Furthermore, while fully disposable duodenoscopes offer promise, questions remain about overall functionality, cost, and the potentially profound environmental impact. Further research is warranted on feasible solutions for infection prevention, and the issues of cost and environmental impact must be addressed before the widespread adoption of disposable duodenoscopes.
Jennifer Maranki, MD, MSc, is professor of medicine and director of endoscopy at Penn State Hershey (Pennsylvania) Medical Center. She reports being a consultant for Boston Scientific.
The reprocessing of endoscopes following gastrointestinal endoscopy is highly effective for mitigating the risk of exogenous infections, yet challenges in duodenoscope reprocessing persist. While several enhanced reprocessing measures have been developed to reduce duodenoscope-related infection risks, the effectiveness of these measures is largely unclear.
Rahul A. Shimpi, MD, and Joshua P. Spaete, MD, from Duke University, Durham, N.C., wrote in a paper in Techniques and Innovations in Gastrointestinal Endoscopy that novel disposable duodenoscope technologies offer promise for reducing infection risk and overcoming current reprocessing challenges. The paper notes that, despite this promise, there is a need to better define the usability, costs, and environmental impact of these disposable technologies.
Current challenges in endoscope reprocessing
According to the authors, the reprocessing of gastrointestinal endoscopes involves several sequential steps that require "meticulous" attention to detail "to ensure the adequacy of reprocessing." Human error is a major contributor to suboptimal reprocessing quality, often stemming from variable adherence to reprocessing protocols across centers and among reprocessing staff.
Despite these challenges, infectious complications associated with gastrointestinal endoscopy are rare, particularly with end-viewing endoscopes. Several high-profile infectious outbreaks associated with duodenoscopes have been reported in recent years, however, heightening awareness of and concern about endoscope reprocessing. Many of these outbreaks, the authors said, have involved multidrug-resistant organisms.
The complex elevator mechanism, which the authors noted "is relatively inaccessible during the precleaning and manual cleaning steps in reprocessing," represents a paramount challenge in duodenoscope reprocessing; this inaccessibility potentially contributes to greater biofilm formation and contamination. Other factors implicated in patient-to-patient transmission of duodenoscope-associated infections include other design issues, human errors in reprocessing, endoscope damage and channel defects, and storage and environmental factors.
“Given the reprocessing challenges posed by duodenoscopes, in 2015 the Food and Drug Administration issued a recommendation that one or more supplemental measures be implemented by facilities as a means to decrease the infectious risk posed by duodenoscopes,” the authors noted, including ethylene oxide (EtO) sterilization, liquid chemical sterilization, and repeat high-level disinfection (HLD). They added, however, that a recent U.S. multisociety reprocessing guideline “does not recommend repeat high-level disinfection over single high-level disinfection, and recommends use of EtO sterilization only for duodenoscopes in infectious outbreak settings.”
New sterilization technologies
Liquid chemical sterilization (LCS) may be a promising alternative to EtO sterilization because it features a shorter disinfection cycle time and causes less endoscope wear and damage. However, clinical data on the effectiveness of LCS in endoscope reprocessing remain very limited.
The high costs and toxicities associated with EtO sterilization may be overcome by plasma-activated gas, another novel low-temperature sterilization technology. This newer technique also features a shorter reprocessing time, making it an attractive option for duodenoscope reprocessing. The authors noted that, although it showed promise in a proof-of-concept study, "plasma-activated gas has not been assessed in working endoscopes or compared directly to existing HLD and EtO sterilization technologies."
Quality indicators in reprocessing
Recently, several quality indicators have been developed to assess the quality of endoscope reprocessing. The indicators, the authors noted, may theoretically allow “for point-of-care assessment of reprocessing quality.” To date, the data to support these indicators are limited.
Adenosine triphosphate (ATP) testing has been the most widely studied indicator because it can detect biofilm during endoscope reprocessing against previously established ATP benchmark levels, the authors wrote. Studies assessing the efficacy of ATP testing, however, are limited by their heterogeneous assays, analytical techniques, and contamination cutoffs.
Hemoglobin, protein, and carbohydrate are other point-of-care indicators that have shown potential for verifying adequate manual cleaning before high-level disinfection or sterilization.
Novel disposable duodenoscope technologies
Because studies have consistently shown residual duodenoscope contamination after both standard and enhanced reprocessing, attention has increasingly turned to novel disposable duodenoscope technologies. In 2019, the FDA recommended a move toward duodenoscopes with disposable components because these could make reprocessing easier, more effective, or altogether unnecessary. According to the authors, six duodenoscopes with disposable components are currently cleared by the FDA: three with a disposable endcap, one with a disposable elevator and endcap, and two that are fully disposable. The authors stated that, while "improved access to the elevator facilitated by a disposable endcap may allow for improved cleaning" and reduce contamination and biofilm formation, no data yet confirm these proposed advantages.
Several questions about the new disposable duodenoscope technologies remain unanswered, including their usability, costs, and environmental impact. The authors summarized several studies addressing these issues, but no clear consensus on how to approach these challenges has been established. They also noted that defining an acceptable rate of infectious risk for disposable duodenoscopes is another "important task" that needs to be accomplished in the near future.
Environmental impact
The authors stated that the U.S. health care system is directly responsible for up to 10% of total U.S. greenhouse gas emissions. In addition, the heavy use of chemicals and water in endoscope reprocessing poses a "substantial" environmental concern. One estimate suggested that performing a mean of 40 endoscopies per day generates around 15.78 tons of CO2 per year.
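The cited figure can be unpacked with simple arithmetic. The sketch below, assuming 365 operating days per year (a figure the article does not specify, used here for illustration only), derives the implied per-procedure footprint from the numbers quoted above.

```python
# Back-of-the-envelope check of the cited estimate: a unit performing
# 40 endoscopies/day is said to emit ~15.78 tons of CO2 per year.
# Assumption (not from the article): 365 operating days per year.

ENDOSCOPIES_PER_DAY = 40
OPERATING_DAYS_PER_YEAR = 365        # assumed; the source does not specify
ANNUAL_CO2_TONS = 15.78              # figure quoted in the article

procedures_per_year = ENDOSCOPIES_PER_DAY * OPERATING_DAYS_PER_YEAR
kg_co2_per_procedure = ANNUAL_CO2_TONS * 1000 / procedures_per_year

print(f"{procedures_per_year} procedures/year")
print(f"~{kg_co2_per_procedure:.2f} kg CO2 per endoscopy")
```

Under that assumption, the quoted figure works out to roughly 1 kg of CO2 per procedure; a unit operating fewer days per year would have a proportionally higher per-procedure footprint.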
Given the uncertain environmental impact of disposable endoscopes, the authors saw a clear need for interventions that reduce their potential harm. Proposed strategies include reducing the number of endoscopies performed, increasing recycling and the use of recyclable materials, and using renewable energy sources in endoscopy units.
“The massive environmental impact of gastrointestinal endoscopy as a whole has become increasingly recognized,” the authors wrote, “and further study and interventions directed at improving the environmental footprint of endoscopy will be of foremost importance.”
The authors disclosed no conflicts of interest.
FROM TECHNIQUES AND INNOVATIONS IN GASTROINTESTINAL ENDOSCOPY
Do myenteric neurons replicate in small intestine?
A new study contradicts controversial findings from a 2017 study that had suggested around two-thirds of myenteric neurons replicate within 1 week under normal conditions, which – if true – would have an impact on research into several GI diseases and pathologies.
Previous research had suggested that enteric nerve cells, which help control peristalsis throughout the digestive tract, do not replicate in the small intestine under normal conditions, with some limited potential observed only after injury, wrote Heikki Virtanen, MD, of the University of Helsinki (Finland), and colleagues. Their report is in Cellular and Molecular Gastroenterology and Hepatology. However, a study by Subhash Kulkarni, PhD, published in 2017, "challenged this dogma, suggesting that almost 70% of myenteric neurons are replaced within 1 week under normal physiological conditions." These findings were reportedly considered controversial, with a "possibly far-reaching impact on future research," Dr. Virtanen and colleagues explained.
According to the researchers, the difference between the controversial study findings and other research results may be partially explained by differences in methodology such as DNA labeling times, antigen retrieval methods, and analyzed portions of the small intestine. Dr. Virtanen and colleagues initiated the current study because no systematic evaluation of those potential confounding variables or attempt at independently replicating the findings had been undertaken.
For example, Dr. Virtanen and colleagues administered the nucleoside analogue 5-iodo-2’-deoxyuridine (IdU) in drinking water with the same concentration and labeling period, DNA denaturation steps, and antibodies as Dr. Kulkarni’s 2017 study had used. However, they also examined additional areas of the small intestine, employed paraffin embedding, performed parallel analysis using “click chemistry”-based detection of 5-ethynyl-2’-deoxyuridine (EdU), and more.
The gut’s epithelial cells turn over within 1 week "and serve as an internal positive control for DNA replication," the researchers noted. In this study, microscopic analysis of immunohistochemically labeled small intestine, in both cryosections and paraffin-embedded sections, and examination of 300 ganglia revealed no IdU-positive enteric nerve cells. In contrast, the researchers wrote, the epithelium demonstrated label retention.
In the discussion section of their paper, Dr. Virtanen and colleagues wrote that, while "proliferating epithelial cells were readily detectable" in the study, they were unable to detect enteric neuronal proliferation. Although they could not identify the reasons for the observations by Kulkarni and colleagues, Dr. Virtanen and colleagues suspected that unnoticed variables in the 2017 study affected its findings.
“The fact that the repeat of exactly the same experiment with the same reagents and methods did not reproduce the finding, not even partially, supports this interpretation and is further supported by the same conclusion using EdU-based click chemistry data and previous studies.”
The authors disclose no conflicts.
The enteric nervous system (ENS) is composed of neurons and glia along the GI tract that are responsible for coordinating its motility, absorption, secretion, and other essential functions. While new neurons are formed during gut development, enteric neurogenesis in adult animals has been a subject of controversy but is of fundamental importance to understanding ENS biology and pathophysiology.
To settle the debate, Virtanen et al. replicated the Kulkarni study using the same methods, with the addition of EdU-based click chemistry, and found no replicating neurons. The bulk of evidence thus supports the concept that enteric neurons in the adult gut are a stable population that undergo minimal turnover. Enteric neuronal progenitors, however, are present in the adult gut and can undergo neurogenesis in response to injury. Further research is needed to identify the signals that activate that neurogenic response and to understand how it can be leveraged to treat neurointestinal diseases.
Allan M. Goldstein, MD, is chief of pediatric surgery at Massachusetts General Hospital, professor of surgery at Harvard Medical School, principal investigator in the Pediatric Surgery Research Laboratories, and codirector of the Massachusetts General Center for Neurointestinal Health, all in Boston. He has no relevant conflicts.
A new study contradicts controversial findings from a 2017 study that had suggested around two-thirds of myenteric neurons replicate within 1 week under normal conditions, which – if true – would have an impact on research into several GI diseases and pathologies.
Previous research had suggested that enteric nerve cells, which help control peristalsis throughout the digestive tract, do not replicate in the small intestine under normal conditions, with some limited potential for it observed only after injury, wrote Heikki Virtanen, MD, of the University of Helsinki (Finland), and colleagues. Their report is in Cellular and Molecular Gastroenterology and Hepatology. However, a study by Subhash Kulkarni, PhD, published in 2017, “challenged this dogma, suggesting that almost 70% of myenteric neurons are replaced within 1 week under normal physiological conditions.” These findings were reportedly considered controversial and presented “possibly far-reaching impact on future research,” Dr. Virtanen and colleagues explained.
According to the researchers, the difference between the controversial study findings and other research results may be partially explained by differences in methodology such as DNA labeling times, antigen retrieval methods, and analyzed portions of the small intestine. Dr. Virtanen and colleagues initiated the current study because no systematic evaluation of those potential confounding variables or attempt at independently replicating the findings had been undertaken.
For example, Dr. Virtanen and colleagues administered the nucleoside analogue 5-iodo-2’-deoxyuridine (IdU) in drinking water with the same concentration and labeling period, DNA denaturation steps, and antibodies as Dr. Kulkarni’s 2017 study had used. However, they also examined additional areas of the small intestine, employed paraffin embedding, performed parallel analysis using “click chemistry”-based detection of 5-ethynyl-2’-deoxyuridine (EdU), and more.
The gut’s epithelial cells turn over within 1 week “and serve as an internal positive control for DNA replication,” the researchers noted. In this study, IdU-positive enteric nerve cells were not revealed in microscopic analysis of immunohistochemically labeled small intestines of both cryosections and paraffin-embedded sections or in measurement of 300 ganglia in the small intestine. In contrast, the researchers wrote that the epithelium demonstrated label retention.
In their discussion section of their paper, Dr. Virtanen and colleagues wrote that while “proliferating epithelial cells were readily detectable” in the study, they were unable to detect enteric neuronal proliferation. Although noting that they could not identify reasons for the observations by Kulkarni and colleagues, Dr. Virtanen and colleagues continued to suspect unnoticed variables in the 2017 study affected its findings.
“The fact that the repeat of exactly the same experiment with the same reagents and methods did not reproduce the finding, not even partially, supports this interpretation and is further supported by the same conclusion using EdU-based click chemistry data and previous studies.”
The authors disclose no conflicts.
A new study contradicts controversial findings from a 2017 study that had suggested around two-thirds of myenteric neurons replicate within 1 week under normal conditions, which – if true – would have an impact on research into several GI diseases and pathologies.
Previous research had suggested that enteric nerve cells, which help control peristalsis throughout the digestive tract, do not replicate in the small intestine under normal conditions, with some limited potential for it observed only after injury, wrote Heikki Virtanen, MD, of the University of Helsinki (Finland), and colleagues. Their report is in Cellular and Molecular Gastroenterology and Hepatology. However, a study by Subhash Kulkarni, PhD, published in 2017, “challenged this dogma, suggesting that almost 70% of myenteric neurons are replaced within 1 week under normal physiological conditions.” These findings were reportedly considered controversial and presented “possibly far-reaching impact on future research,” Dr. Virtanen and colleagues explained.
According to the researchers, the difference between the controversial study findings and other research results may be partially explained by differences in methodology such as DNA labeling times, antigen retrieval methods, and analyzed portions of the small intestine. Dr. Virtanen and colleagues initiated the current study because no systematic evaluation of those potential confounding variables or attempt at independently replicating the findings had been undertaken.
For example, Dr. Virtanen and colleagues administered the nucleoside analogue 5-iodo-2’-deoxyuridine (IdU) in drinking water with the same concentration and labeling period, DNA denaturation steps, and antibodies as Dr. Kulkarni’s 2017 study had used. However, they also examined additional areas of the small intestine, employed paraffin embedding, performed parallel analysis using “click chemistry”-based detection of 5-ethynyl-2’-deoxyuridine (EdU), and more.
The gut’s epithelial cells turn over within 1 week “and serve as an internal positive control for DNA replication,” the researchers noted. In this study, microscopic analysis of immunohistochemically labeled small intestine, in both cryosections and paraffin-embedded sections, revealed no IdU-positive enteric nerve cells, nor did measurement of 300 ganglia in the small intestine. In contrast, the researchers wrote, the epithelium demonstrated label retention.
In the discussion section of their paper, Dr. Virtanen and colleagues wrote that while “proliferating epithelial cells were readily detectable” in the study, they were unable to detect enteric neuronal proliferation. Although noting that they could not identify reasons for the observations by Kulkarni and colleagues, Dr. Virtanen and colleagues continued to suspect that unnoticed variables in the 2017 study affected its findings.
“The fact that the repeat of exactly the same experiment with the same reagents and methods did not reproduce the finding, not even partially, supports this interpretation and is further supported by the same conclusion using EdU-based click chemistry data and previous studies.”
The authors disclose no conflicts.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
AGA Clinical Practice Guideline: Diagnosis, treatment of rare hamartomatous polyposis
The guideline was developed by a joint task force comprising experts representing the American College of Gastroenterology, the American Gastroenterological Association, and the American Society for Gastrointestinal Endoscopy.
Gastrointestinal hamartomatous polyposis syndromes are rare, autosomal dominant disorders associated with intestinal and extraintestinal tumors. Expert consensus statements have previously offered some recommendations for managing these syndromes, but clinical data are scarce, so the present review “is intended to establish a starting point for future research,” lead author C. Richard Boland, MD, of the University of California, San Diego, and colleagues reported.
According to the investigators, “there are essentially no long-term prospective controlled studies of comparative effectiveness of management strategies for these syndromes.” As a result, their recommendations are based on “low-quality” evidence according to GRADE criteria.
Still, Dr. Boland and colleagues highlighted that “there has been tremendous progress in recent years, both in understanding the underlying genetics that underpin these disorders and in elucidating the biology of associated premalignant and malignant conditions.”
The guideline was published online in Gastroenterology.
Four syndromes reviewed
The investigators gathered these data to provide an overview of genetic and clinical features for each syndrome, as well as management strategies. Four disorders are included: juvenile polyposis syndrome; Peutz-Jeghers syndrome; hereditary mixed polyposis syndrome; and PTEN-hamartoma tumor syndrome, encompassing Bannayan-Riley-Ruvalcaba syndrome and Cowden’s syndrome.
Although all gastrointestinal hamartomatous polyposis syndromes are caused by germline alterations, Dr. Boland and colleagues pointed out that diagnoses are typically made based on clinical criteria, with germline results serving as confirmatory evidence.
The guideline recommends that any patient with a family history of hamartomatous polyps, or with a history of at least two hamartomatous polyps, should undergo genetic testing. The guideline also provides more nuanced genetic testing algorithms for each syndrome.
Among all the hamartomatous polyp disorders, Peutz-Jeghers syndrome is most understood, according to the investigators. It is caused by aberrations in the STK11 gene, and is characterized by polyps with “branching bands of smooth muscle covered by hyperplastic glandular mucosa” that may occur in the stomach, small intestine, and colon. Patients are also at risk of extraintestinal neoplasia.
For management of Peutz-Jeghers syndrome, the guideline advises frequent endoscopic surveillance to prevent mechanical obstruction and bleeding, as well as multidisciplinary surveillance of the breasts, pancreas, ovaries, testes, and lungs.
Juvenile polyposis syndrome is most often characterized by solitary, sporadic polyps in the colorectum (98% of patients affected), followed distantly by polyps in the stomach (14%), ileum (7%), jejunum (7%), and duodenum (7%). The condition is linked with abnormalities in BMPR1A or SMAD4 genes, with SMAD4 germline abnormalities more often leading to “massive” gastric polyps, gastrointestinal bleeding, protein-losing enteropathy, and a higher incidence of gastric cancer in adulthood. Most patients with SMAD4 mutations also have hereditary hemorrhagic telangiectasia, characterized by gastrointestinal bleeding from mucocutaneous telangiectasias, arteriovenous malformations, and epistaxis.
Management of juvenile polyposis syndrome depends on frequent colonoscopies with polypectomies beginning at age 12-15 years.
“The goal of surveillance in juvenile polyposis syndrome is to mitigate symptoms related to the disorder and decrease the risk of complications from the manifestations, including cancer,” Dr. Boland and colleagues wrote.
PTEN-hamartoma tumor syndrome, which includes both Bannayan-Riley-Ruvalcaba syndrome and Cowden’s syndrome, is caused by abnormalities in the eponymous PTEN gene. Patients with the condition have an increased risk of colon cancer and polyposis, as well as extraintestinal cancers.
Diagnosis of PTEN-hamartoma tumor syndrome may be complex, involving “clinical examination, mammography and breast MRI, thyroid ultrasound, transvaginal ultrasound, upper gastrointestinal endoscopy, colonoscopy, and renal ultrasound,” according to the guideline.
After diagnosis, frequent colonoscopies are recommended, typically starting at age 35 years, as well as continued surveillance of other organs.
Hereditary mixed polyposis syndrome, which involves attenuated colonic polyposis, is the rarest of the four disorders, having been reported in only “a few families,” according to the guideline. The condition has been linked with “large duplications of the promoter region or entire GREM1 gene.”
Onset is typically in the late 20s, “which is when colonoscopic surveillance should begin,” the investigators wrote. More data are needed to determine appropriate surveillance intervals and if the condition is associated with increased risk of extraintestinal neoplasia.
This call for more research into gastrointestinal hamartomatous polyposis syndromes carried through to the conclusion of the guideline.
“Long-term prospective studies of mutation carriers are still needed to further clarify the risk of cancer and the role of surveillance in these syndromes,” Dr. Boland and colleagues wrote. “With increases in genetic testing and evaluation, future studies will be conducted with more robust cohorts of genetically characterized, less heterogeneous populations. However, there is also a need to study patients and families with unusual phenotypes where no genotype can be found.”
The investigators disclosed no conflicts of interest with the current guideline; however, they provided a list of industry relationships, including Salix Pharmaceuticals, Ferring Pharmaceuticals, and Pfizer, among others.
FROM GASTROENTEROLOGY
AGA Clinical Practice Update: Expert review of dietary options for IBS
The American Gastroenterological Association has published a clinical practice update on dietary interventions for patients with irritable bowel syndrome (IBS). The topics range from identification of suitable candidates for dietary interventions to levels of evidence for specific diets, which are becoming increasingly recognized for their key role in managing patients with IBS, according to lead author William D. Chey, MD, of the University of Michigan, Ann Arbor, and colleagues.
“Most medical therapies for IBS improve global symptoms in fewer than one-half of patients, with a therapeutic gain of 7%-15% over placebo,” the researchers wrote in Gastroenterology. “Most patients with IBS associate their GI symptoms with eating food.”
According to Dr. Chey and colleagues, clinicians who are considering dietary modifications for treating IBS should first recognize the inherent challenges presented by this process and be aware that new diets won’t work for everyone.
“Specialty diets require planning and preparation, which may be impractical for some patients,” they wrote, noting that individuals with “decreased cognitive abilities and significant psychiatric disease” may be unable to follow diets or interpret their own responses to specific foods. Special diets may also be inappropriate for patients with financial constraints, and “should be avoided in patients with an eating disorder.”
Because of the challenges involved in dietary interventions, Dr. Chey and colleagues advised clinical support from a registered dietitian nutritionist or other resource.
Patients who are suitable candidates for intervention and willing to try a new diet should first provide information about their current eating habits. A food trial can then be personalized and implemented for a predetermined amount of time. If the patient does not respond, then the diet should be stopped and changed to a new diet or another intervention.
Dr. Chey and colleagues discussed three specific dietary interventions and their corresponding levels of evidence: soluble fiber; the low-fermentable oligo-, di-, and monosaccharides and polyols (FODMAP) diet; and a gluten-free diet.
“Soluble fiber is efficacious in treating global symptoms of IBS,” they wrote, citing 15 randomized controlled trials. Soluble fiber is most suitable for patients with constipation-predominant IBS, and different soluble fibers may yield different outcomes based on characteristics such as rate of fermentation and viscosity. In contrast, insoluble fiber is unlikely to help with IBS, and may worsen abdominal pain and bloating.
The low-FODMAP diet is “currently the most evidence-based diet intervention for IBS,” especially for patients with diarrhea-predominant IBS. Dr. Chey and colleagues offered a clear roadmap for employing the diet. First, patients should eat only low-FODMAP foods for 4-6 weeks. If symptoms don’t improve, the diet should be stopped. If symptoms do improve, foods containing a single FODMAP should be reintroduced one at a time, each in increasing quantities over 3 days, alongside documentation of symptom responses. Finally, the diet should be personalized based on these responses. The majority of patients (close to 80%) “can liberalize” a low-FODMAP diet based on their responses.
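The stepwise protocol above can be sketched as a simple schedule builder; this is a purely illustrative sketch, not clinical guidance, and the FODMAP groups, test foods, and ordering are assumptions for illustration.

```python
# Illustrative sketch of the low-FODMAP reintroduction phase described above.
# Food groups and test foods are hypothetical examples, not clinical advice.

REINTRODUCTION_DAYS = 3  # each single FODMAP is tested in increasing amounts over 3 days

# Hypothetical single-FODMAP test foods, challenged one group at a time
fodmap_groups = {
    "fructose": "honey",
    "lactose": "milk",
    "fructans": "wheat bread",
    "galacto-oligosaccharides": "chickpeas",
    "polyols": "mushrooms",
}

def reintroduction_plan(groups):
    """Build an ordered plan: one 3-day challenge per FODMAP group,
    with symptom responses documented after each day."""
    plan = []
    for offset, (group, food) in enumerate(groups.items()):
        start = offset * REINTRODUCTION_DAYS + 1
        plan.append({
            "group": group,
            "test_food": food,
            "days": list(range(start, start + REINTRODUCTION_DAYS)),
        })
    return plan

plan = reintroduction_plan(fodmap_groups)
print(plan[0])  # first challenge: fructose over days 1-3
```

The elimination phase (4-6 weeks of low-FODMAP eating) precedes this schedule, and the final personalization step uses the documented responses to decide which groups to restore.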
In contrast with the low-FODMAP diet, which has a relatively solid body of supporting evidence, efficacy data are still limited for treating IBS with a gluten-free diet. “Although observational studies found that most patients with IBS improve with a gluten-free diet, randomized controlled trials have yielded mixed results,” Dr. Chey and colleagues explained.
Their report cited a recent monograph on the topic that concluded that gluten-free eating offered no significant benefit over placebo (relative risk, 0.46; 95% confidence interval, 0.16-1.28). While some studies have documented positive results with a gluten-free diet, Dr. Chey and colleagues suggested that confounding variables such as the nocebo effect and the impact of other dietary factors have yet to be ruled out. “At present, it remains unclear whether a gluten-free diet is of benefit to patients with IBS.”
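The reported relative risk of 0.46 is not statistically significant because its 95% confidence interval spans 1.0 (the null value for a ratio measure); a minimal check makes this explicit. The function name here is illustrative.

```python
# Why RR 0.46 (95% CI, 0.16-1.28) is not statistically significant:
# for a ratio measure (RR, OR, HR), a 95% CI that includes 1.0 means the
# result is conventionally non-significant at the 5% level.

def ci_includes_no_effect(lower, upper, null_value=1.0):
    """Return True if the confidence interval spans the null value."""
    return lower <= null_value <= upper

rr, lo, hi = 0.46, 0.16, 1.28
print(ci_includes_no_effect(lo, hi))  # True -> not significant at p < 0.05
```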
Dr. Chey and colleagues also explored IBS biomarkers. While some early data have shown that biomarkers may predict dietary responses, “there is insufficient evidence to support their routine use in clinical practice. ... Further efforts to identify and validate biomarkers that predict response to dietary interventions are needed to deliver ‘personalized nutrition.’ ”
The clinical practice update was commissioned and approved by the AGA CPU Committee and the AGA Governing Board. The researchers disclosed relationships with Biomerica, Salix, Mauna Kea Technologies, and others.
This article was updated May 19, 2022.
The American Gastroenterological Association has published a clinical practice update on dietary interventions for patients with irritable bowel syndrome (IBS). The topics range from identification of suitable candidates for dietary interventions, to levels of evidence for specific diets, which are becoming increasingly recognized for their key role in managing patients with IBS, according to lead author William D. Chey, MD, of the University of Michigan, Ann Arbor, and colleagues.
“Most medical therapies for IBS improve global symptoms in fewer than one-half of patients, with a therapeutic gain of 7%-15% over placebo,” the researchers wrote in Gastroenterology. “Most patients with IBS associate their GI symptoms with eating food.”
According to Dr. Chey and colleagues, clinicians who are considering dietary modifications for treating IBS should first recognize the inherent challenges presented by this process and be aware that new diets won’t work for everyone.
“Specialty diets require planning and preparation, which may be impractical for some patients,” they wrote, noting that individuals with “decreased cognitive abilities and significant psychiatric disease” may be unable to follow diets or interpret their own responses to specific foods. Special diets may also be inappropriate for patients with financial constraints, and “should be avoided in patients with an eating disorder.”
Because of the challenges involved in dietary interventions, Dr. Chey and colleagues advised clinical support from a registered dietitian nutritionist or other resource.
Patients who are suitable candidates for intervention and willing to try a new diet should first provide information about their current eating habits. A food trial can then be personalized and implemented for a predetermined amount of time. If the patient does not respond, then the diet should be stopped and changed to a new diet or another intervention.
Dr. Chey and colleagues discussed three specific dietary interventions and their corresponding levels of evidence: soluble fiber; the low-fermentable oligo-, di-, and monosaccharides and polyols (FODMAP) diet; and a gluten-free diet.
“Soluble fiber is efficacious in treating global symptoms of IBS,” they wrote, citing 15 randomized controlled trials. Soluble fiber is most suitable for patients with constipation-predominant IBS, and different soluble fibers may yield different outcomes based on characteristics such as rate of fermentation and viscosity. In contrast, insoluble fiber is unlikely to help with IBS, and may worsen abdominal pain and bloating.
The low-FODMAP diet is “currently the most evidence-based diet intervention for IBS,” especially for patients with diarrhea-predominant IBS. Dr. Chey and colleagues offered a clear roadmap for employing the diet. First, patients should eat only low-FODMAP foods for 4-6 weeks. If symptoms don’t improve, the diet should be stopped. If symptoms do improve, foods containing a single FODMAP should be reintroduced one at a time, each in increasing quantities over 3 days, alongside documentation of symptom responses. Finally, the diet should be personalized based on these responses. The majority of patients (close to 80%) “can liberalize” a low-FODMAP diet based on their responses.
In contrast with the low-FODMAP diet, which has a relatively solid body of supporting evidence, efficacy data are still limited for treating IBS with a gluten-free diet. “Although observational studies found that most patients with IBS improve with a gluten-free diet, randomized controlled trials have yielded mixed results,” Dr. Chey and colleagues explained.
Their report cited a recent monograph on the topic that concluded that gluten-free eating offered no significant benefit over placebo (relative risk, 0.46; 95% confidence interval, 0.16-1.28). While some studies have documented positive results with a gluten-free diet, Dr. Chey and colleagues suggested that confounding variables such as the nocebo effect and the impact of other dietary factors have yet to be ruled out. “At present, it remains unclear whether a gluten-free diet is of benefit to patients with IBS.”
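Why the cited result counts as "no significant benefit" follows from how confidence intervals are read: a 95% CI for a relative risk that spans 1.0 is consistent with no effect. A minimal check, using the figures quoted above:

```python
# A 95% confidence interval for a relative risk that includes 1.0
# cannot exclude "no effect" at the 0.05 level.
rr, ci_low, ci_high = 0.46, 0.16, 1.28   # figures from the cited monograph
significant = not (ci_low <= 1.0 <= ci_high)
# significant -> False: the interval includes 1.0
```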
Dr. Chey and colleagues also explored IBS biomarkers. While some early data have shown that biomarkers may predict dietary responses, “there is insufficient evidence to support their routine use in clinical practice. ... Further efforts to identify and validate biomarkers that predict response to dietary interventions are needed to deliver ‘personalized nutrition.’ ”
The clinical practice update was commissioned and approved by the AGA CPU Committee and the AGA Governing Board. The researchers disclosed relationships with Biomerica, Salix, Mauna Kea Technologies, and others.
This article was updated May 19, 2022.
FROM GASTROENTEROLOGY
Cellular gene profiling may predict IBD treatment response
Transcriptomic profiling of phagocytes in the lamina propria of patients with inflammatory bowel disease (IBD) may guide future treatment selection, according to investigators.
Mucosal gut biopsies revealed that phagocytic gene expression correlated with inflammatory states, types of IBD, and responses to therapy, reported lead author Gillian E. Jacobsen, an MD/PhD candidate at the University of Miami, and colleagues.
In an article in Gastro Hep Advances, the investigators wrote that “lamina propria phagocytes along with epithelial cells represent a first line of defense and play a balancing act between tolerance toward commensal microbes and generation of immune responses toward pathogenic microorganisms. ... Inappropriate responses by lamina propria phagocytes have been linked to IBD.”
To better understand these responses, the researchers collected 111 gut mucosal biopsies from 54 patients with IBD, among whom 59% were taking biologics, 72% had inflammation in at least one biopsy site, and 41% had previously used at least one other biologic. Samples were analyzed to determine cell phenotypes, gene expression, and cytokine responses to in vitro Janus kinase (JAK) inhibitor exposure.
Ms. Jacobsen and colleagues noted that most reports that address the function of phagocytes focus on circulating dendritic cells, monocytes, or monocyte-derived macrophages, rather than on resident phagocyte populations located in the lamina propria. However, these circulating cells “do not reflect intestinal inflammation, or whole tissue biopsies.”
The investigators identified phagocytes based on CD11b expression and phenotyped the CD11b+-enriched cells using flow cytometry. In samples with active inflammation, cells were most often granulocytes (45.5%), followed by macrophages (22.6%) and monocytes (9.4%). Uninflamed samples had a slightly lower proportion of granulocytes (33.6%), about the same proportion of macrophages (22.7%), and a higher rate of B cells (15.6% vs. 9.0%).
Ms. Jacobsen and colleagues highlighted the absolute uptick in granulocytes, including neutrophils.
“Neutrophilic infiltration is a major indicator of IBD activity and may be critically linked to ongoing inflammation,” they wrote. “These data demonstrate that CD11b+ enrichment reflects the inflammatory state of the biopsies.”
The investigators also showed that transcriptional profiles of lamina propria CD11b+ cells differed “greatly” between colon and ileum, which suggested that “the location or cellular environment plays a marked role in determining the gene expression of phagocytes.”
CD11b+ cell gene expression profiles also correlated with ulcerative colitis versus Crohn’s disease, although the researchers noted that these patterns were less pronounced than correlations with inflammatory states.
“There are pathways common to inflammation regardless of the IBD type that could be used as markers of inflammation or targets for therapy.”
Comparing colon samples from patients who responded to anti–tumor necrosis factor therapy with those who were refractory to anti-TNF therapy revealed significant associations between response type and 52 differentially expressed genes.
“These genes were mostly immunoglobulin genes up-regulated in the anti–TNF-treated inflamed colon, suggesting that CD11b+ B cells may play a role in medication refractoriness.”
Evaluating inflamed colon and anti-TNF refractory ileum revealed differential expression of OSM, a known marker of TNF-resistant disease, as well as TREM1, a proinflammatory marker. In contrast, NTS genes showed high expression in uninflamed samples on anti-TNF therapy. The researchers noted that these findings “may be used to build precision medicine approaches in IBD.”
Further experiments showed that in vitro exposure of anti-TNF refractory samples to JAK inhibitors resulted in significantly reduced secretion of interleukin-8 and TNF-alpha.
“Our study provides functional data that JAK inhibition with tofacitinib (JAK1/JAK3) or ruxolitinib (JAK1/JAK2) inhibits lipopolysaccharide-induced cytokine production even in TNF-refractory samples,” the researchers wrote. “These data inform the response of patients to JAK inhibitors, including those refractory to other treatments.”
The study was supported by Pfizer, the National Institute of Diabetes and Digestive and Kidney Diseases, the Micky & Madeleine Arison Family Foundation Crohn’s & Colitis Discovery Laboratory, and Martin Kalser Chair in Gastroenterology at the University of Miami. The investigators disclosed additional relationships with Takeda, Abbvie, Eli Lilly, and others.
Inflammatory bowel diseases are complex and heterogenous disorders driven by inappropriate immune responses to luminal substances, including diet and microbes, resulting in chronic inflammation of the gastrointestinal tract. Therapies for IBD largely center around suppressing immune responses; however, given the complexity and heterogeneity of these diseases, consensus on which aspect of the immune response to suppress and which cell type to target in a given patient is unclear.
Sreeram Udayan, PhD, and Rodney D. Newberry, MD, are with the division of gastroenterology in the department of medicine at Washington University, St. Louis.
FROM GASTRO HEP ADVANCES
Deep learning system outmatches pathologists in diagnosing liver lesions
A new deep learning system can classify hepatocellular nodular lesions (HNLs) via whole-slide images, improving risk stratification of patients and diagnostic rate of hepatocellular carcinoma (HCC), according to investigators.
While the model requires further validation, it could eventually be used to optimize accuracy and efficiency of histologic diagnoses, potentially decreasing reliance on pathologists, particularly in areas with limited access to subspecialists.
In an article published in Gastroenterology, Na Cheng, MD, of Sun Yat-sen University, Guangzhou, China, and colleagues wrote that the “diagnostic process [for HNLs] is laborious, time-consuming, and subject to the experience of the pathologists, often with significant interobserver and intraobserver variability. ... Therefore, [an] automated analysis system is highly demanded in the pathology field, which could considerably ease the workload, speed up the diagnosis, and facilitate the in-time treatment.”
To this end, Dr. Cheng and colleagues developed the hepatocellular-nodular artificial intelligence model (HnAIM), which scans whole-slide images to identify seven types of tissue, including well-differentiated HCC, high-grade dysplastic nodules, low-grade dysplastic nodules, hepatocellular adenoma, focal nodular hyperplasia, and background tissue.
Developing and testing HnAIM was a multistep process that began with three subspecialist pathologists, who independently reviewed and classified liver slides from surgical resection. Unanimous agreement was achieved in 649 slides from 462 patients. These slides were then scanned to create whole-slide images, which were divided into sets for training (70%), validation (15%), and internal testing (15%). Accuracy, measured by area under the curve (AUC), was over 99.9% for the internal testing set. The accuracy of HnAIM was independently, externally validated.
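The 70%/15%/15% partition described above is a standard supervised learning workflow. A minimal sketch of such a split, assuming hypothetical slide identifiers (the paper's actual splitting procedure and tooling are not specified here):

```python
# Illustrative 70/15/15 train/validation/test split over slide IDs.
# The function name and IDs are hypothetical, not from the study.
import random

def split_slides(slide_ids, seed=0):
    ids = list(slide_ids)
    random.Random(seed).shuffle(ids)        # reproducible shuffle
    n = len(ids)
    n_train, n_val = int(n * 0.70), int(n * 0.15)
    return (ids[:n_train],                  # training set
            ids[n_train:n_train + n_val],   # validation set
            ids[n_train + n_val:])          # internal test set

train, val, test = split_slides(range(649))  # 649 slides, as in the study
# len(train), len(val), len(test) -> 454, 97, 98
```

One design caveat: when several slides come from the same patient (649 slides from 462 patients here), splitting at the patient level rather than the slide level avoids leaking a patient's tissue into both training and test sets.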
First, HnAIM evaluated liver biopsy slides from 30 patients at one center. Results were compared with diagnoses made by nine pathologists classified as either senior, intermediate, or junior. While HnAIM correctly diagnosed 100% of the cases, senior pathologists correctly diagnosed 94.4% of the cases, followed in accuracy by intermediate (86.7%) and junior (73.3%) pathologists.
The researchers noted that the “rate of agreement with subspecialists was higher for HnAIM than for all 9 pathologists at distinguishing 7 liver tissues, with important diagnostic implications for fragmentary or scarce biopsy specimens.”
Next, HnAIM evaluated 234 samples from three hospitals. Accuracy was slightly lower, with an AUC of 93.5%. The researchers highlighted how HnAIM consistently differentiated precancerous lesions and well-defined HCC from benign lesions and background tissues.
A final experiment showed how HnAIM reacted to the most challenging cases. The investigators selected 12 cases without definitive diagnoses and found that, similar to the findings of three subspecialist pathologists, HnAIM did not reach a single diagnostic conclusion.
The researchers reported that “This may be due to a number of potential reasons, such as inherent uncertainty in the 2-dimensional interpretation of a 3-dimensional specimen, the limited number of tissue samples, and cognitive factors such as anchoring.”
However, HnAIM contributed to the diagnostic process by generating multiple diagnostic possibilities with weighted likelihoods. After reviewing these results, the expert pathologists reached consensus in 5 out of 12 cases. Moreover, two out of three expert pathologists agreed on all 12 cases, improving the agreement rate from 25% to 100%.
The researchers concluded that the model holds the promise to facilitate human HNL diagnoses and improve efficiency and quality. It can also reduce the workload of pathologists, especially where subspecialists are unavailable.
The study was supported by the National Natural Science Foundation of China, the Guangdong Basic and Applied Basic Research Foundation, the Natural Science Foundation of Guangdong Province, and others. The investigators reported no conflicts of interest.
As the prevalence of hepatocellular carcinoma (HCC) continues to rise, the early and accurate detection and diagnosis of HCC remains paramount to improving patient outcomes. In cases of typical or advanced HCC, an accurate diagnosis is made using CT or MR imaging. However, hepatocellular nodular lesions (HNLs) with atypical or inconclusive radiographic appearances are often biopsied to achieve a histopathologic diagnosis. In addition, accurate diagnosis of an HNL following liver resection or transplantation is important to long-term surveillance and management. An accurate histopathologic diagnosis relies on the availability of experienced subspecialty pathologists and remains a costly and labor-intensive process that can lead to delays in diagnosis and care.
Hannah P. Kim, MD, MSCR, is an assistant professor in the division of gastroenterology, hepatology, and nutrition in the department of medicine at Vanderbilt University Medical Center, Nashville, Tenn. She has no conflicts of interest.
A new deep learning system can classify hepatocellular nodular lesions (HNLs) via whole-slide images, improving risk stratification of patients and diagnostic rate of hepatocellular carcinoma (HCC), according to investigators.
While the model requires further validation, it could eventually be used to optimize accuracy and efficiency of histologic diagnoses, potentially decreasing reliance on pathologists, particularly in areas with limited access to subspecialists.
In an article published in Gastroenterology, Na Cheng, MD, of Sun Yat-sen University, Guangzhou, China, and colleagues wrote that the “diagnostic process [for HNLs] is laborious, time-consuming, and subject to the experience of the pathologists, often with significant interobserver and intraobserver variability. ... Therefore, [an] automated analysis system is highly demanded in the pathology field, which could considerably ease the workload, speed up the diagnosis, and facilitate the in-time treatment.”
To this end, Dr. Cheng and colleagues developed the hepatocellular-nodular artificial intelligence model (HnAIM) that can scan whole-image slides to identify seven types of tissue: well-differentiated HCC, high-grade dysplastic nodules, low-grade dysplastic nodules, hepatocellular adenoma, focal nodular hyperplasia, and background tissue.
Developing and testing HnAIM was a multistep process that began with three subspecialist pathologists, who independently reviewed and classified liver slides from surgical resection. Unanimous agreement was achieved in 649 slides from 462 patients. These slides were then scanned to create whole-slide images, which were divided into sets for training (70%), validation (15%), and internal testing (15%). Accuracy, measured by area under the curve (AUC), was over 99.9% for the internal testing set. The accuracy of HnAIM was independently, externally validated.
First, HnAIM evaluated liver biopsy slides from 30 patients at one center. Results were compared with diagnoses made by nine pathologists classified as either senior, intermediate, or junior. While HnAIM correctly diagnosed 100% of the cases, senior pathologists correctly diagnosed 94.4% of the cases, followed in accuracy by intermediate (86.7%) and junior (73.3%) pathologists.
The researchers noted that the “rate of agreement with subspecialists was higher for HnAIM than for all 9 pathologists at distinguishing 7 liver tissues, with important diagnostic implications for fragmentary or scarce biopsy specimens.”
Next, HnAIM evaluated 234 samples from three hospitals. Accuracy was slightly lower, with an AUC of 93.5%. The researchers highlighted how HnAIM consistently differentiated precancerous lesions and well-defined HCC from benign lesions and background tissues.
A final experiment showed how HnAIM reacted to the most challenging cases. The investigators selected 12 cases without definitive diagnoses and found that, similar to the findings of three subspecialist pathologists, HnAIM did not reach a single diagnostic conclusion.
The researchers reported that “This may be due to a number of potential reasons, such as inherent uncertainty in the 2-dimensional interpretation of a 3-dimensional specimen, the limited number of tissue samples, and cognitive factors such as anchoring.”
However, HnAIM contributed to the diagnostic process by generating multiple diagnostic possibilities with weighted likelihood. After reviewing these results, the expert pathologists reached consensus in 5 out of 12 cases. Moreover, two out of three expert pathologists agreed on all 12 cases, improving agreement rate from 25% to 100%.
The researchers concluded that the model holds the promise to facilitate human HNL diagnoses and improve efficiency and quality. It can also reduce the workload of pathologists, especially where subspecialists are unavailable.
The study was supported by the National Natural Science Foundation of China, the Guangdong Basic and Applied Basic Research Foundation, the Natural Science Foundation of Guangdong Province, and others. The investigators reported no conflicts of interest.
A new deep learning system can classify hepatocellular nodular lesions (HNLs) via whole-slide images, improving risk stratification of patients and diagnostic rate of hepatocellular carcinoma (HCC), according to investigators.
While the model requires further validation, it could eventually be used to optimize accuracy and efficiency of histologic diagnoses, potentially decreasing reliance on pathologists, particularly in areas with limited access to subspecialists.
In an article published in Gastroenterology, Na Cheng, MD, of Sun Yat-sen University, Guangzhou, China, and colleagues wrote that the “diagnostic process [for HNLs] is laborious, time-consuming, and subject to the experience of the pathologists, often with significant interobserver and intraobserver variability. ... Therefore, [an] automated analysis system is highly demanded in the pathology field, which could considerably ease the workload, speed up the diagnosis, and facilitate the in-time treatment.”
To this end, Dr. Cheng and colleagues developed the hepatocellular-nodular artificial intelligence model (HnAIM) that can scan whole-image slides to identify seven types of tissue: well-differentiated HCC, high-grade dysplastic nodules, low-grade dysplastic nodules, hepatocellular adenoma, focal nodular hyperplasia, and background tissue.
Developing and testing HnAIM was a multistep process that began with three subspecialist pathologists, who independently reviewed and classified liver slides from surgical resection. Unanimous agreement was achieved in 649 slides from 462 patients. These slides were then scanned to create whole-slide images, which were divided into sets for training (70%), validation (15%), and internal testing (15%). Accuracy, measured by area under the curve (AUC), was over 99.9% for the internal testing set. The accuracy of HnAIM was independently, externally validated.
First, HnAIM evaluated liver biopsy slides from 30 patients at one center. Results were compared with diagnoses made by nine pathologists classified as either senior, intermediate, or junior. While HnAIM correctly diagnosed 100% of the cases, senior pathologists correctly diagnosed 94.4% of the cases, followed in accuracy by intermediate (86.7%) and junior (73.3%) pathologists.
The researchers noted that the “rate of agreement with subspecialists was higher for HnAIM than for all 9 pathologists at distinguishing 7 liver tissues, with important diagnostic implications for fragmentary or scarce biopsy specimens.”
Next, HnAIM evaluated 234 samples from three hospitals. Accuracy was slightly lower, with an AUC of 93.5%. The researchers highlighted how HnAIM consistently differentiated precancerous lesions and well-defined HCC from benign lesions and background tissues.
A final experiment showed how HnAIM reacted to the most challenging cases. The investigators selected 12 cases without definitive diagnoses and found that, similar to the findings of three subspecialist pathologists, HnAIM did not reach a single diagnostic conclusion.
The researchers reported that “This may be due to a number of potential reasons, such as inherent uncertainty in the 2-dimensional interpretation of a 3-dimensional specimen, the limited number of tissue samples, and cognitive factors such as anchoring.”
However, HnAIM contributed to the diagnostic process by generating multiple diagnostic possibilities with weighted likelihood. After reviewing these results, the expert pathologists reached consensus in 5 out of 12 cases. Moreover, two out of three expert pathologists agreed on all 12 cases, improving agreement rate from 25% to 100%.
The researchers concluded that the model holds promise for facilitating human HNL diagnosis and improving efficiency and quality. It could also reduce the workload of pathologists, especially in settings where subspecialists are unavailable.
The study was supported by the National Natural Science Foundation of China, the Guangdong Basic and Applied Basic Research Foundation, the Natural Science Foundation of Guangdong Province, and others. The investigators reported no conflicts of interest.
FROM GASTROENTEROLOGY
Liquid biopsy a valuable tool for detecting, monitoring HCC
Liquid biopsy using circulating tumor DNA (ctDNA) detection and profiling is a valuable tool for clinicians managing hepatocellular carcinoma (HCC), particularly for monitoring progression, researchers wrote in a recent review.
Details of the review, led by co–first authors Xueying Lyu and Yu-Man Tsui, both of the department of pathology and State Key Laboratory of Liver Research at the University of Hong Kong, were published in Cellular and Molecular Gastroenterology and Hepatology.
Because there are few treatment options for advanced-stage liver cancer, scientists are searching for noninvasive ways to detect liver cancer before it progresses. Liver resection is the primary treatment for HCC, but the recurrence rate is high. Early detection increases the ability to identify relevant molecular-targeted drugs and helps predict patient response.
There is growing interest in noninvasive circulating cell-free DNA (cfDNA) as well as in ctDNA – both are part of promising strategies to test circulating DNA in the bloodstream. Together with other circulating biomarkers, they are collectively referred to as liquid biopsy.
HCC can be detected noninvasively through plasma ctDNA released from dying cancer cells. Detection depends on determining whether the circulating tumor DNA carries the same molecular alterations as its tumor source. Because cfDNA contains genomic DNA from different tumor clones, or from tumors at different sites within a patient, it can support real-time monitoring of tumor progression.
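The matching step described above amounts to checking whether plasma variants overlap with the tumor's known somatic alterations. The sketch below is a hypothetical illustration of that concept only: the function name, the variant tuples, and the one-shared-variant threshold are all assumptions, not a validated diagnostic rule.

```python
def ctdna_detected(tumor_variants, plasma_variants, min_shared=1):
    """Flag a plasma sample as ctDNA-positive when it shares at least
    `min_shared` somatic variants with the patient's tumor tissue.

    Variants are represented as (chromosome, position, ref, alt) tuples.
    Hypothetical sketch of the matching concept, not an assay.
    """
    shared = set(tumor_variants) & set(plasma_variants)
    return len(shared) >= min_shared, shared

# Illustrative, made-up variant calls:
tumor = {("chr17", 7674220, "G", "A"),   # tumor-derived mutation
         ("chr11", 534289, "T", "C")}
plasma = {("chr17", 7674220, "G", "A"),  # matches the tumor source
          ("chr5", 1295113, "G", "A")}   # unrelated plasma variant
positive, shared = ctdna_detected(tumor, plasma)
```

A real pipeline would also have to filter out variants from clonal hematopoiesis and germline background before attributing a plasma variant to the tumor.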
Barriers to widespread clinical use of liquid biopsy include lack of standardization of the collection process. Procedures differ across health systems on how much blood should be collected, which tubes should be used for collection, and how samples should be stored and shipped. The study authors suggested that “specialized tubes can be used for blood sample collection to reduce the chance of white blood cell rupture and genomic DNA contamination from the damaged white blood cells.”
Further research is needed
The study findings indicated that some aspects of liquid biopsy with cfDNA/ctDNA still need further exploration. For example, the effects of tumor vascularization, tumor aggressiveness, metabolic activity, and cell death mechanisms on the dynamics of ctDNA in the bloodstream need to be identified.
It’s not yet clear how cfDNA is released into the bloodstream. cfDNA released actively from the tumor may convey a different message than cfDNA released passively from dying cells upon treatment: the former represents treatment-resistant cells or subclones, while the latter represents treatment-responsive ones. Moreover, it is difficult to detect ctDNA mutations in early-stage cancers, which have a lower tumor burden.
The investigators wrote: “The contributions of cfDNA from apoptosis, necrosis, autophagic cell death, and active release at different time points during disease progression, treatment response, and resistance appearance are poorly understood and will affect interpretation of the clinical observation in cfDNA assays.” A lower limit of detection needs to be determined and a standard curve set so that researchers can quantify the allelic frequencies of the mutants in cfDNA and avoid false-negative detection.
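The quantification problem above reduces to simple arithmetic: the variant allele frequency (VAF) is the fraction of sequencing reads supporting the mutant allele, and a mutation call is only trustworthy above the assay's lower limit of detection. The sketch below is a minimal illustration; the 0.1% limit is a hypothetical placeholder, which is exactly why the authors call for a standard curve to set it empirically.

```python
def variant_allele_frequency(alt_reads, total_reads):
    """Fraction of sequencing reads supporting the mutant allele."""
    if total_reads == 0:
        raise ValueError("no coverage at this position")
    return alt_reads / total_reads

def call_mutation(alt_reads, total_reads, lod=0.001):
    """Call a mutation only when the VAF clears the assay's lower
    limit of detection (LoD). The 0.1% default is a hypothetical
    placeholder, not a validated threshold."""
    return variant_allele_frequency(alt_reads, total_reads) >= lod

# 8 mutant reads out of 10,000 -> VAF 0.08%, below a 0.1% LoD,
# so the variant goes uncalled: a potential false negative if the
# LoD was set too high for this tumor's actual cfDNA shedding.
low_burden = call_mutation(8, 10_000)
```

This is why an LoD chosen without a standard curve risks systematic false negatives in early-stage, low-burden disease.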
They urged establishing external quality assurance to verify laboratory performance, proficiency in the cfDNA diagnostic test, and interpretation of results, in order to identify errors in sampling, procedures, and decision making. Legal liability and the cost-effectiveness of using plasma cfDNA in treatment decisions also need to be considered.
The researchers wrote that, to better understand how ctDNA/cfDNA can be used to complement precision medicine in liver cancer, large multicenter cohorts and long-term follow-up are needed to compare ctDNA-guided decision-making against standard treatment without guidance from ctDNA profiling.
The authors disclosed having no conflicts of interest.
Detection and characterization of circulating tumor DNA (ctDNA) is one of the major forms of liquid biopsy. Because ctDNA can reflect molecular features of cancer tissues, it is considered an ideal alternative to tissue biopsy. Furthermore, it can overcome the limitation of tumor tissue biopsies such as bleeding, needle tract seeding, and sampling error.
Currently, several large biomarker trials of ctDNA for early HCC detection are underway. Once its accuracy is established in phase 3-4 biomarker studies, the role of ctDNA in the context of the existing surveillance program should be further defined. As the combination of ctDNA and other orthogonal circulating biomarkers has been shown to enhance performance, future research should explore biomarker panels that include ctDNA and other promising markers. Predictive biomarkers for treatment response are an unmet need in HCC. Investigating a specific ctDNA marker panel as a predictor of immunotherapy responsiveness would be of great interest and is under active investigation.
Ju Dong Yang, MD, is with the Karsh Division of Digestive and Liver Diseases in the department of medicine, the Comprehensive Transplant Center, and the Samuel Oschin Comprehensive Cancer Institute at Cedars-Sinai Medical Center, Los Angeles. He disclosed providing consulting services for Exact Sciences, Exelixis, and Eisai.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY