Fecal bacteria plus FIT promising in diagnosis of colorectal cancer
SAN DIEGO – A panel of four fecal bacteria can reliably discriminate between colorectal cancer (CRC) patients and controls. Combining these four biomarkers with the fecal immunochemical test (FIT) further increases the sensitivity for CRC, at the cost of some specificity.
“This study establishes a reliable platform for a convenient translational approach of using new markers identified by a metagenomics approach. This panel, in combination with other modalities, may be useful for a noninvasive method of diagnosing CRC. Our data look most promising for the combination of FIT plus the four biomarkers,” Dr. Jessie Qiaoyi Liang said at a presentation during the annual Digestive Disease Week.
In this study, FIT had a sensitivity of 70.3% and a specificity of 98.3%. Using the four biomarkers led to a sensitivity of 83.8% and a specificity of 83.2%, while combining them with FIT achieved a sensitivity of 92.8% and a specificity of 81.5%.
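As a rough illustration of how such figures are derived: sensitivity is the fraction of cancer patients a test flags, and specificity is the fraction of controls it correctly clears. The counts below are hypothetical, chosen only to reproduce the reported FIT figures in a cohort of 111 CRC patients and 119 controls; they are not the study's raw data.

```python
# Hypothetical confusion-matrix counts for FIT, chosen to reproduce the
# reported 70.3% sensitivity / 98.3% specificity; not the study's raw data.
tp, fn = 78, 33    # CRC patients with a positive / negative FIT
tn, fp = 117, 2    # controls with a negative / positive FIT

sensitivity = tp / (tp + fn)   # true positives / all CRC patients
specificity = tn / (tn + fp)   # true negatives / all controls

print(f"sensitivity = {sensitivity:.1%}")   # 70.3%
print(f"specificity = {specificity:.1%}")   # 98.3%
```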
The four bacteria included Fusobacterium nucleatum (Fn, a previously identified biomarker), as well as three newly described candidates: Bacteroides clarus (Bc), Clostridium hathewayi (Ch), and one undefined species (in-house label ‘m7’).
The study evaluated fecal bacterial marker candidates, identified by metagenomic sequencing, for the diagnosis of CRC using quantitative PCR (qPCR), a simple, cost-effective, and targeted method.
A strong correlation was observed between qPCR assays and the metagenomic approach for these candidate bacteria. The four bacteria were selected from 21 candidate bacteria. Each of the four candidates was analyzed separately and in combination to determine their utility for the diagnosis of CRC.
The four biomarkers were validated in stool samples from two cohorts of patients prior to undergoing colonoscopy: a Hong Kong cohort of 370 patients (170 found to have CRC and 200 controls) and a second cohort of Shanghai patients (total 69, 33 CRC patients and 36 controls).
All four were found to significantly discriminate between CRC and controls in both cohorts of patients.
Next, the researchers compared the four-bacteria panel with a three-bacteria panel (Fn, m7, and Bc) in a cohort of 230 subjects from Hong Kong – 111 with CRC and 119 controls. The four-bacteria panel performed significantly better than the three-bacteria panel.
In that same cohort of patients, they compared FIT alone, the four-bacteria panel alone, and the combination of the two. The four markers had higher sensitivity than FIT alone for stage I colorectal cancer, but similar sensitivity for stages II, III, and IV.
Using both FIT and the four-bacteria panel increased the sensitivity from 70.3% for FIT to 92.8%.
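A back-of-the-envelope check on that combination, assuming (unrealistically) that the two tests are statistically independent and that a patient is called positive when either test is positive, is sketched below. That the observed combined sensitivity (92.8%) falls a little short of the naive prediction suggests the bacterial markers' positives partly overlap with FIT's.

```python
fit_sens, fit_spec = 0.703, 0.983      # reported FIT performance
panel_sens, panel_spec = 0.838, 0.832  # reported four-bacteria panel performance

# "Either test positive" rule under an independence assumption:
# sensitivity rises, specificity falls.
naive_sens = 1 - (1 - fit_sens) * (1 - panel_sens)
naive_spec = fit_spec * panel_spec

print(f"naive combined sensitivity = {naive_sens:.1%}")  # ~95.2% vs 92.8% observed
print(f"naive combined specificity = {naive_spec:.1%}")  # ~81.8% vs 81.5% observed
```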
“At this time, we can say this is promising research. The panel cannot be used in the clinical setting. We plan further studies. The combination of FIT plus the four biomarkers looks the most promising,” said Dr. Liang, a research assistant professor at the Chinese University of Hong Kong.
She noted that the panel is not discriminatory enough for adenoma patients. “We are still looking for other markers for adenomas,” she said.
Commenting on the state of the art for biomarkers in gastrointestinal cancer, Dr. Joseph Sung, Chinese University of Hong Kong, said that there are thousands of biomarkers under study.
“Combinations of biomarkers will turn out to be more useful. They should be used in combination with other diagnostic modules. Cancer biomarkers require large validation studies in different populations.
“We are unlikely to find a single magic bullet,” he said. “The list of biomarkers will continue to grow. All are undergoing larger-scale validation in centers in China, but we need to be cognizant that populations around the world will not have the same biomarkers. The real issue in microbiome markers is geographic variation,” he stated.
AT DDW® 2016
Key clinical point: Four fecal bacteria combined with FIT have a high sensitivity for colorectal cancer.
Major finding: Use of the four biomarkers led to a sensitivity of 83.8% and a specificity of 83.2%, while combining them with FIT achieved a sensitivity of 92.8% and a specificity of 81.5%.
Data source: The researchers used metagenomic sequencing to evaluate the utility of four fecal bacteria in diagnosing colorectal cancer in two cohorts of patients: 370 from Hong Kong and 69 from Shanghai.
Disclosures: The study was funded by the Hong Kong government, Group Research Funding in China, and private donations.
Tacrolimus worsens IBD post liver transplant for primary sclerosing cholangitis
SAN DIEGO – A retrospective study provides insight into why some patients have a milder versus a more severe course of inflammatory bowel disease (IBD) after liver transplant for primary sclerosing cholangitis (PSC).
Recurrent PSC, prolonged use of steroids, and cancer development after liver transplant were associated with a milder course of IBD, but tacrolimus use was associated with increased IBD flare post transplant.
“The course of IBD is highly variable after liver transplant for PSC. PSC is associated with IBD in 60% to 90% of patients, and IBD worsens in about 30% of PSC-IBD patients after transplant. We wanted to explore the risk factors for worsening IBD in this setting,” Dr. Mohamad Mouchli of the Mayo Clinic in Rochester, Minn., explained at the annual Digestive Disease Week.
For purposes of this study, progression of IBD was defined as the need for escalation of medical therapy, compared with before liver transplant, or need for colectomy for medically refractory IBD.
The study population included patients with PSC-IBD who underwent liver transplant for noncholangiocarcinoma indications at the Mayo Clinic from 1998 to 2012. Patients were followed through February 2015.
The investigators screened 373 patients; after exclusions for cancer, absence of IBD at transplant, and pretransplant colectomy, 151 patients with an intact colon remained and formed the basis of further analysis.
Median age at transplant was 46 years and two-thirds of the patients were male. Transplant-related variables included the following: 23% experienced allograft failure, 36% had recurrent PSC, 25.2% had CMV infection, 19.2% were retransplanted, 22.5% developed cancer after liver transplant, and 52.3% had acute cellular rejection.
Before transplant, 69 patients had quiescent IBD with no therapy and 62 were maintained on 5-aminosalicylates. Post transplant, despite transplant-related immunosuppression, 56 patients (37.1%) required escalation of therapy, 87 patients (57.6%) had a stable course, and 8 patients (5.3%) improved.
Risk of IBD progression at 1, 5, and 10 years was 4%, 18.5%, and 25.5%, respectively. Thirty-five patients underwent colectomy after transplant: the 1-, 5-, and 10-year risks of colectomy were 2%, 9.3%, and 17.2%, respectively. Fourteen percent of patients required anti–tumor necrosis factor therapy after transplant.
On multivariate analysis, tacrolimus exposure emerged as a risk factor for progression of IBD. Tacrolimus immunosuppression was twice as likely as cyclosporine-based immunosuppression to lead to IBD progression. By contrast, recurrent PSC, use of steroids for longer than 6 months, and cancer development after liver transplant were protective against IBD progression.
No association was found between progression of IBD and transplant-related infection or mismatch, immunosuppression with mycophenolate or azathioprine, or IBD-related factors such as pretransplant IBD status or empirical initiation of 5-aminosalicylates within 4 months of liver transplant. During the question-and-answer session following his presentation, Dr. Mouchli was asked whether these results justify prophylactic colectomy. He said that could be considered in patients with active IBD before transplant, but on a case-by-case basis.
AT DDW® 2016
Key clinical point: Tacrolimus exposure was an independent risk factor for IBD progression after liver transplant in patients with PSC-IBD.
Major finding: Tacrolimus immunosuppression was twice as likely as cyclosporine-based immunosuppression to lead to worsening IBD post liver transplant.
Data source: Retrospective study of the natural history of IBD following liver transplant in 151 patients with PSC-IBD.
Disclosures: Dr. Mouchli had no financial disclosures to report.
Acid suppression appears to prevent cancer in Barrett’s esophagus
SAN DIEGO – In patients with Barrett’s esophagus (BE), acid suppression with proton pump inhibitors (PPIs) and, to a lesser extent, histamine-2 receptor antagonists (H2RAs) reduced the risk of progression to esophageal adenocarcinoma (EAC), according to findings from a study reported at the annual Digestive Disease Week.
Both treatments were independently associated with reduced risk of progression to cancer in the nested, case-control study. Taking PPIs reduced the risk of developing EAC by 69% and taking H2RAs reduced the risk by 45%.
“There are no concrete guidelines regarding use of PPIs for patients with Barrett’s esophagus,” said presenting author Dr. Mimi C. Tan, a postdoctoral research fellow at Baylor College of Medicine in Houston. “The guidelines for these agents are based on symptoms of reflux. Although this study does not tell us for sure, it looks like taking these medications [in Barrett’s esophagus] prevents cancer. The effect is stronger with PPIs, but H2RAs still have an effect.”
The incidence of BE is increasing in the United States, and BE is the strongest known risk factor for EAC.
“There is a knowledge gap, with a deficiency of studies of cases with longitudinal follow-up in a cohort of BE patients examining the effects of PPIs and H2RAs on progression to EAC,” Dr. Tan explained. “We conducted this study in a large cohort of BE patients and hypothesized that acid suppression with PPIs and H2RAs would decrease the risk of EAC.”
The study included a cohort of 29,536 male veterans diagnosed with BE between 2004 and 2009 identified in the national Veterans Affairs Corporate Data Warehouse. Of those, 760 had an ICD-9 diagnosis of EAC. Patients with incident BE who developed EAC (cases) were matched with BE patients who had not developed cancer by the time of the corresponding case’s EAC diagnosis (controls). Cases were followed until 2011.
After exclusions, the final analysis was based on 311 cases with EAC after BE and 856 matched controls with no EAC. Use of acid suppression was based on records of medications dispensed at a Veterans Affairs pharmacy.
Patients with EAC were significantly more likely to be overweight or obese than controls (86.8% versus 80.6%, respectively, P = .04) and were more often cigarette smokers (19% versus 13%, respectively, P = .02).
Cases were less likely than controls to fill at least one prescription for a PPI: 65% of cases versus 83% of controls. Dr. Tan said the number of H2RA users was much smaller; 8% of cases had at least one prescription for an H2RA, compared with 14% of controls.
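From those prescription-fill fractions, a crude (unadjusted) odds ratio for PPI exposure can be sketched. It points in the same protective direction as the reported 69% risk reduction (which corresponds to an odds ratio of roughly 0.31); the crude figure below is only illustrative arithmetic, not the study's adjusted estimate.

```python
# Reported fractions filling >= 1 PPI prescription (illustrative arithmetic,
# not the study's adjusted analysis).
cases_exposed = 0.65      # patients who developed EAC
controls_exposed = 0.83   # matched controls without EAC

odds_cases = cases_exposed / (1 - cases_exposed)
odds_controls = controls_exposed / (1 - controls_exposed)
crude_or = odds_cases / odds_controls   # < 1 means exposure looks protective

print(f"crude odds ratio = {crude_or:.2f}")  # ~0.38
```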
Duration of PPI use was not associated with risk, whereas duration of H2RA use was.
“This was an observational study, not a randomized trial, so we can’t say with certainty that PPI and H2RA use reduce the risk of cancer,” Dr. Tan cautioned. “But the study suggests that there may be a role for these medications. Further study is needed.”
In a separate interview, Dr. Tan noted that the risk of developing EAC in people with BE is low – 0.5% per year.
During the question-and-answer session following her presentation, an audience member said that it is interesting and disturbing that so many BE patients are not on acid suppression.
“There are no clear-cut guidelines that state that BE patients should be on a PPI,” Dr. Tan noted. “Maybe that’s why primary care doctors are not prescribing them.”
In a talk after Dr. Tan’s presentation, Dr. Stuart Spechler of UT Southwestern Medical Center, Dallas, noted that clonal diversity at diagnosis of BE appears to identify patients likely to develop EAC.
“This raises a concern: If these cells are predestined to become cancer, can they achieve salvation through good acts?” Dr. Spechler asked. “Dr. Tan’s study suggests that they can. The malignant potential of BE may be predetermined, and acid suppression therapy reduces the risk of progression to EAC.”
Dr. Tan had no financial disclosures to report.
SAN DIEGO – In patients with Barrett’s esophagus (BE), acid suppression with proton pump inhibitors (PPIs) and, to a lesser extent, histamine2 receptor antagonists (H2RA) reduced the risk of progression to esophageal adenocarcinoma (EAC), according to the findings from a study reported at the annual Digestive Disease Week.
Both treatments were independently associated with reduced risk of progression to cancer in the nested, case-control study. Taking PPIs reduced the risk of developing EAC by 69% and taking H2RAs reduced the risk by 45%.
“There are no concrete guidelines regarding use of PPIs for patients with Barrett’s esophagus,” said presenting author Dr Mimi C. Tan, a postdoctoral research fellow at Baylor College of Medicine in Houston. “The guidelines for these agents are based on symptoms of reflux. Although this study does not tell us for sure, it looks like taking these medications [in Barrett’s esophagus] prevents cancer. The effect is stronger with PPIs, but H2RAs still have an effect.”
The incidence of BE is increasing in the United States, and BE is the only known risk factor for EAC.
“There is a knowledge gap, with a deficiency of studies of cases with longitudinal follow-up in a cohort of BE patients examining the effects of PPIs and H2RAs on progression to EAC,” Dr. Tan explained. “We conducted this study in a large cohort of BE patients and hypothesized that acid suppression with PPIs and H2RAs would decrease the risk of EAC.”
The study included a cohort of 29,536 male veterans diagnosed with BE between 2004 and 2009 identified in the national Veterans Affairs Corporate Data Warehouse. Of those, 760 had an ICD-9 diagnosis of EAC. Cases of incident BE (in patients who developed EAC) were matched with controls with BE who did not develop cancer by the time of each EAC diagnosis in the corresponding case. Cases were followed until 2011.
After exclusions, the final analysis was based on 311 cases with EAC after BE and 856 matched controls with no EAC. Use of acid suppression was based on reports of medications dispensed at a Veteran’s Affairs pharmacy.
In general, patients with EAC were significantly more overweight and obese than controls (86.8% versus 80.6%, respectively, P = .04) and were more often cigarette smokers than controls (19% versus 13%, respectively, P = .02).
Cases were less likely than controls to fill at least one prescription for a PPI: 65% of cases versus 83% of controls. Dr Tan said the number of H2RA users was much smaller; 8% of cases had at least one prescription for an H2RA, compared with 14% of controls.
For PPIs, duration of use was not associated with risk, whereas the opposite was true for H2RA users.
“This was an observational study, not a randomized trial, so we can’t say with certainty that PPI and H2RA use reduce the risk of cancer,” Dr. Tan cautioned. “But the study suggests that there may be a role for these medications. Further study is needed.”
In a separate interview, Dr Tan noted that the risk of developing EAC in people with BE is low – 0.5% per year.
During the question and answer session following her presentation, an audience member said that it is interesting and disturbing that so many BE patients are not on acid suppression.
“There are no clear cut guidelines that state that BE patients should be on a PPI,” Dr Tan noted. “Maybe that’s why primary care doctors are not prescribing them.”
In a talk after Dr Tan’s presentation, Dr Stuart Spechler of UT Southwestern Medical Center, Dallas, noted that it appears that clonal diversity at diagnosis of BE identifies patients likely to develop EAC.
“This raises a concern: If these cells are predestined to become cancer, can they achieve salvation through good acts?” Dr. Spechler asked. “Dr Tan’s study suggests that they can. The malignant potential of BE may be predetermined, and acid suppression therapy reduces the risk of progression to EAC.”
Dr. Tan had no financial disclosures to report.
SAN DIEGO – In patients with Barrett’s esophagus (BE), acid suppression with proton pump inhibitors (PPIs) and, to a lesser extent, histamine2 receptor antagonists (H2RA) reduced the risk of progression to esophageal adenocarcinoma (EAC), according to the findings from a study reported at the annual Digestive Disease Week.
Both treatments were independently associated with reduced risk of progression to cancer in the nested, case-control study. Taking PPIs reduced the risk of developing EAC by 69% and taking H2RAs reduced the risk by 45%.
“There are no concrete guidelines regarding use of PPIs for patients with Barrett’s esophagus,” said presenting author Dr Mimi C. Tan, a postdoctoral research fellow at Baylor College of Medicine in Houston. “The guidelines for these agents are based on symptoms of reflux. Although this study does not tell us for sure, it looks like taking these medications [in Barrett’s esophagus] prevents cancer. The effect is stronger with PPIs, but H2RAs still have an effect.”
The incidence of BE is increasing in the United States, and BE is the only known risk factor for EAC.
“There is a knowledge gap, with a deficiency of studies of cases with longitudinal follow-up in a cohort of BE patients examining the effects of PPIs and H2RAs on progression to EAC,” Dr. Tan explained. “We conducted this study in a large cohort of BE patients and hypothesized that acid suppression with PPIs and H2RAs would decrease the risk of EAC.”
The study included a cohort of 29,536 male veterans diagnosed with BE between 2004 and 2009 identified in the national Veterans Affairs Corporate Data Warehouse. Of those, 760 had an ICD-9 diagnosis of EAC. Cases of incident BE (in patients who developed EAC) were matched with controls with BE who did not develop cancer by the time of each EAC diagnosis in the corresponding case. Cases were followed until 2011.
After exclusions, the final analysis was based on 311 cases with EAC after BE and 856 matched controls with no EAC. Use of acid suppression was based on records of medications dispensed at a Veterans Affairs pharmacy.
In general, patients with EAC were significantly more likely than controls to be overweight or obese (86.8% versus 80.6%, P = .04) and to be cigarette smokers (19% versus 13%, P = .02).
Cases were less likely than controls to fill at least one prescription for a PPI: 65% of cases versus 83% of controls. Dr. Tan said the number of H2RA users was much smaller: 8% of cases had at least one prescription for an H2RA, compared with 14% of controls.
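As a back-of-the-envelope illustration of how case-control proportions like these translate into a risk estimate, the unadjusted odds ratios implied by the prescription figures above can be computed directly. (The study's reported 69% and 45% risk reductions come from an adjusted, matched analysis, so this is only a rough sketch, not the authors' method.)

```python
def odds_ratio(p_exposed_cases, p_exposed_controls):
    """Unadjusted odds ratio of exposure (here, filling >=1 prescription)."""
    odds_cases = p_exposed_cases / (1 - p_exposed_cases)
    odds_controls = p_exposed_controls / (1 - p_exposed_controls)
    return odds_cases / odds_controls

# PPI use: 65% of cases vs 83% of controls filled at least one prescription
or_ppi = odds_ratio(0.65, 0.83)
# H2RA use: 8% of cases vs 14% of controls
or_h2ra = odds_ratio(0.08, 0.14)

print(round(or_ppi, 2))   # ~0.38, i.e. roughly 60% lower odds, unadjusted
print(round(or_h2ra, 2))  # ~0.53, i.e. roughly 45% lower odds, unadjusted
```

Note how close the crude H2RA estimate lands to the adjusted 45% figure, while the crude PPI estimate understates the adjusted 69%; matching and covariate adjustment account for the difference.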
For PPIs, duration of use was not associated with risk; for H2RAs, it was.
“This was an observational study, not a randomized trial, so we can’t say with certainty that PPI and H2RA use reduce the risk of cancer,” Dr. Tan cautioned. “But the study suggests that there may be a role for these medications. Further study is needed.”
In a separate interview, Dr. Tan noted that the risk of developing EAC in people with BE is low – 0.5% per year.
During the question and answer session following her presentation, an audience member said that it is interesting and disturbing that so many BE patients are not on acid suppression.
“There are no clear-cut guidelines that state that BE patients should be on a PPI,” Dr. Tan noted. “Maybe that’s why primary care doctors are not prescribing them.”
In a talk after Dr. Tan’s presentation, Dr. Stuart Spechler of UT Southwestern Medical Center, Dallas, noted that clonal diversity at diagnosis of BE appears to identify patients likely to develop EAC.
“This raises a concern: If these cells are predestined to become cancer, can they achieve salvation through good acts?” Dr. Spechler asked. “Dr. Tan’s study suggests that they can. The malignant potential of BE may be predetermined, and acid suppression therapy reduces the risk of progression to EAC.”
Dr. Tan had no financial disclosures to report.
AT DDW® 2016
Key clinical point: In patients with Barrett’s esophagus, acid suppression with proton pump inhibitors and, to a lesser extent, histamine2 receptor antagonists reduces the likelihood of progression to esophageal cancer.
Major finding: PPIs reduced the risk of developing esophageal cancer by 69%, and H2RAs reduced the risk by 45%.
Data source: A prospective, nested, case-control study of 311 incident cases with esophageal cancer and 856 matched controls.
Disclosures: Dr. Tan had no financial disclosures to report.
Bile salts may be biomarker for recurrent C. difficile infection
SAN DIEGO – Bile acid salts in the stool may be a potential biomarker for recurrent episodes of Clostridium difficile infection, a preliminary study suggests.
Although the finding needs to be validated in a prospective study, it could have therapeutic implications, the study investigators said.
“If our results are validated, we could take bile salt profiles of patients who come in with their first episode of C. difficile infection and, using this biomarker, adjust therapy accordingly,” said study lead author Dr. Jessica Allegretti, who presented the findings at the annual Digestive Disease Week. “A patient at high risk of recurrence could get fecal transplant earlier. Right now, fecal transplant is used for recurrent infection.”
C. difficile represents a major public health threat, and recurrent disease complicates 20%-30% of cases.
The disease is transmitted by spores that are resistant to heat and antibiotics and that germinate in the gastrointestinal tract. Bile acids are part of that process: they assist in the digestion of fat, and a small proportion pass into the colon, where primary bile acids are transformed into secondary bile acids such as deoxycholate and lithocholate.
In vitro, primary bile acids can stimulate C. difficile germination, explained Dr. Allegretti of Brigham and Women’s Hospital, Boston. “Antibiotic therapy may ablate critical members of the microbiota. We aimed to assess bile acid profiles in patients with C. difficile infection, compared with controls, to understand their role in pathogenesis and hopefully identify a biomarker for recurrence.”
The cross-sectional study collected serum and a single stool sample from 60 patients in three groups: patients with a first episode of C. difficile infection (fCDI) sampled before antibiotics (20 patients), patients with a recurrent episode (rCDI) on chronic vancomycin at the time of sampling (19), and healthy controls who were fecal transplant donors (21).
The researchers sequenced stool microbial components and conducted bile salt metabolomic profiling. Significant differences were revealed in microbial analysis of the stool samples: Primary bile salts (which induce germination) were significantly elevated in rCDI, compared with fCDI and controls – while secondary bile salts in the stool (which are protective) such as deoxycholate and lithocholate were significantly elevated in controls, compared with fCDI and rCDI (P = .0002 and P = .0007, respectively).
“The same trends were seen in the plasma samples, but were less dramatic than in the stool,” Dr. Allegretti noted.
The median predicted bile salt hydrolase (BSH) gene abundance in rCDI was 20% of the median value in controls (P = .001) and also was significantly lower than in fCDI (P = .001). No significant difference in predicted BSH gene abundance was seen between the control and fCDI groups.
“Reduced predicted bacterial bile salt hydrolase gene abundance may be associated with a diminished capacity to metabolize bile acids,” she said.
The difference in BSH gene abundance between controls and rCDI was largely due to changes in the abundance of 10 bacterial taxa, Dr. Allegretti said.
“This study reinforces the importance of bile salts in CDI and demonstrates for the first time in humans that this shift can be appreciated as early as the first episode of CDI in patients who are antibiotic naive,” Dr. Allegretti said.
She noted that rCDI samples were collected from patients on chronic antibiotic therapy, which may explain some of the decrease in microbial diversity seen in the study.
In search of a biomarker, “secondary bile acids clearly seem to be the winner, and for now, stool seems to make more sense than blood for samples,” she stated.
Dr. Allegretti and her colleagues are conducting a prospective validation study.
The American College of Gastroenterology funded the study.
AT DDW® 2016
Key clinical point: Bile salts may identify patients at risk of recurrent Clostridium difficile infection who require more aggressive first-line therapy.
Major finding: Secondary bile acids in stool can distinguish between first-episode patients, recurrent-episode patients, and healthy controls.
Data source: A prospective cross-sectional study of 60 participants.
Disclosures: The American College of Gastroenterology funded the study.
Mirtazapine improved functional dyspepsia, psychological distress
SAN DIEGO – The tetracyclic antidepressant mirtazapine significantly improved indicators of functional dyspepsia and psychological distress in a single-center, randomized, placebo-controlled, double-blinded trial of 116 adults.
After 3 months of treatment, the mirtazapine group had significantly less nausea, early satiety, depression, somatization, hostility, and phobic anxiety, compared with the control group (all P values < .05), Dr. Yaoyao Gong reported at the annual Digestive Disease Week. Most patients began improving after 1-2 weeks of mirtazapine therapy, she added.
“We assume that mirtazapine might improve functional dyspepsia by improving depression and anxiety, reducing visceral sensitivity, and through its prokinetic effects on gastrointestinal transit,” said Dr. Gong, who is at the gastroenterology department of Nanjing Medical University, China.
Mirtazapine is a presynaptic alpha-2 antagonist that also blocks the 5-HT2a, 5-HT2c, 5-HT3, and H-1 receptors. This atypical antidepressant reduced visceral hypersensitivity and increased gastric accommodation, gastric emptying, and colonic transit time in animal studies, and improved weight loss, early satiety, and nausea in a recent U.S. placebo-controlled pilot trial (Clin Gastroenterol Hepatol. 2016 Mar;14[3]:385-92).
As a biopsychosocial disorder, functional dyspepsia involves both physical and psychological symptoms, Dr. Gong noted. To explore how mirtazapine affects both realms, she and her associates randomized outpatients who met Rome III functional dyspepsia criteria and had also been diagnosed by a psychiatrist with anxiety, depression, or somatization disorder to receive either mirtazapine, 15 mg per day (61 patients), or placebo (55 patients). Both groups also received omeprazole, 30 mg per day, and mosapride, 5 mg three times a day. Patients were mostly in their early 40s; none was taking SSRIs or MAOIs; they had no history of abdominal surgery and no lesions on upper endoscopy; and they had negative 13C urea breath tests for Helicobacter pylori infection.
After 3 months of treatment, mirtazapine was associated with significantly lower scores on 7-point Likert scales for nausea and early satiety, compared with standard care alone, said Dr. Gong. Mirtazapine did not significantly improve the other Rome III criteria for functional dyspepsia, but the overall symptom score was significantly lower than in the control group.
In addition, the mirtazapine group had a “markedly better” average Symptom Checklist (SCL)-90 score, compared with the control group, and individual measures of depression, somatization, hostility, and phobic anxiety also were significantly lower than for controls (P < .05).
Mirtazapine did not significantly affect overall anxiety, a frequent psychological feature of functional dyspepsia. The adverse effect most often associated with mirtazapine was dry mouth; the researchers did not measure weight gain, another common adverse effect of the medication.
None of the patients stopped treatment because of adverse effects. The study group was too small to look at the separate effects of mirtazapine on epigastric pain and postprandial distress syndrome, Dr. Gong noted.
“We plan to assess these patients again after 12 months of treatment,” she said.
Dr. Gong did not report funding sources and had no disclosures.
AT DDW® 2016
Key clinical point: A 15-mg daily dose of mirtazapine controlled comorbid functional dyspepsia and psychological distress more effectively than placebo.
Major finding: At 3 months, the mirtazapine group had significantly less nausea, early satiety, depression, somatization, hostility, and phobic anxiety than patients who received only standard of care (P < .05 for all comparisons).
Data source: A single-center, randomized, double-blind, placebo-controlled trial of 116 adults with functional dyspepsia and depression, anxiety, or somatization disorder.
Disclosures: Dr. Gong reported no funding sources and had no disclosures.
Endocuff safely cut nearly 1 minute off colonoscopy time
SAN DIEGO – A disposable Endocuff cut colonoscopic withdrawal times by nearly a minute and slightly improved polyp detection, compared with standard colonoscopy, according to a randomized, prospective trial of 562 patients.
The Endocuff caused no known adverse effects except for superficial mucosal trauma, Dr. Paul Feuerstadt said at the annual Digestive Disease Week. The study, which is the first of its kind in the United States, suggests that the Endocuff can improve the efficiency of colonoscopies without undermining detection rates, he added.
The plastic, flexible Endocuff slides onto the tip of a standard colonoscope, and has phalanges that press on the colonic mucosa “to improve polyp and adenoma detection rates, at least in theory,” said Dr. Feuerstadt, who is at the Gastroenterology Center of Connecticut in Hamden, Conn.
Use of the device improved the polyp detection rate by 63% and adenoma detection by 86% in a previous study in Germany.
For the current study, Dr. Feuerstadt and his associates screened 1,067 consecutive patients at two endoscopy centers in Connecticut, and excluded those with colitis, inflammatory bowel disease, diarrhea, chronic splenomegaly, and a history of surgical resection or colonic stricture. The 562 remaining patients were randomized to either Endocuff-assisted or standard colonoscopies performed by eight endoscopists with historically high adenoma detection rates of nearly 44%.
Use of the Endocuff seemed to slightly improve polyp detection, though none of the primary comparisons reached statistical significance, despite sufficient study power, Dr. Feuerstadt said. The rate of polyp detection was 63% for Endocuff-assisted colonoscopy and 60% for standard colonoscopy (P = .41), while rates of adenoma detection were 42% and 45%, respectively. There was a nonsignificant trend toward higher detection of sessile serrated adenomas (11% versus 9%; P = .37).
Notably, average withdrawal times were 9.9 minutes with the Endocuff (standard deviation, 5.5 minutes), versus 11.1 minutes without it (standard deviation, 5.9 minutes; P = .02). There were no perforations or other major adverse events, no instances of the Endocuff coming off the scope, and no difference in bleeding rates between the two groups.
However, 8% of Endocuff patients had mild mucosal trauma, compared with none of the control group, Dr. Feuerstadt reported.
The two groups resembled one another demographically, clinically, and in terms of their family history of colonic polyps. However, the Endocuff group had a higher frequency of first-degree relatives younger than age 50 years with colon cancer, Dr. Feuerstadt noted.
The endoscopists had an average historical adenoma detection rate of 43.6%, “very similar to the 44.7% we saw in the study,” he added. “The device yields similar adenoma detection rates overall, with shorter withdrawal times, thereby increasing colonoscopic efficiency.”
Dr. Feuerstadt did not report funding sources. He disclosed consulting fees from Medivators, which makes endoscope reprocessing and related products.
SAN DIEGO – A disposable Endocuff cut colonoscopic withdrawal times by nearly a minute and slightly improved polyp detection, compared with standard colonoscopy, according to a randomized, prospective trial of 562 patients.
The Endocuff caused no known adverse effects except for superficial mucosal trauma, Dr. Paul Feuerstadt said at the annual Digestive Disease Week. The study, which is the first of its kind in the United States, suggests that the Endocuff can improve the efficiency of colonoscopies without undermining detection rates, he added.
The plastic, flexible Endocuff slides onto the tip of a standard colonoscope, and has phalanges that press on the colonic mucosa “to improve polyp and adenoma detection rates, at least in theory,” said Dr. Feuerstadt, who is at the Gastroenterology Center of Connecticut in Hamden, Conn.
Use of the device improved the polyp detection rate by 63% and adenoma detection by 86% in a previous study in Germany.
For the current study, Dr. Feuerstadt and his associates screened 1,067 consecutive patients at two endoscopy centers in Connecticut, and excluded those with colitis, inflammatory bowel disease, diarrhea, chronic splenomegaly, and a history of surgical resection or colonic stricture. The 562 remaining patients were randomized to either Endocuff-assisted or standard colonoscopies performed by eight endoscopists with historically high adenoma detection rates of nearly 44%.
Use of the Endocuff seemed to slightly improve polyp detection, though none of the primary comparisons reached statistical significance, despite sufficient study power, Dr. Feuerstadt said. The rate of polyp detection was 63% for Endocuff-assisted colonoscopy and 60% for standard colonoscopy (P = .41), while rates of adenoma detection were 42% and 45%, respectively. There was a nonsignificant trend toward higher detection of sessile serrated adenomas (11% versus 9%; P = .37).
Notably, average withdrawal times were 9.9 minutes with the Endocuff (standard deviation, 5.5 minutes), versus 11.1 minutes without it (standard deviation, 5.9 minutes; P = .02). There were no perforations or other major adverse events, no instances of the Endocuff coming off the scope, and no difference in bleeding rates between the two groups.
However, 8% of Endocuff patients had mild mucosal trauma, compared with none of the control group, Dr. Feuerstadt reported.
The two groups resembled one another demographically, clinically, and in terms of their family history of colonic polyps. However, the Endocuff group had a higher frequency of first-degree relatives younger than age 50 years with colon cancer, Dr. Feuerstadt noted.
The endoscopists had an average historical ADR of 43.6%, “very similar to the 44.7% we saw in the study,” he added. “The device yields similar adenoma detection rates overall, with shorter withdrawal times, thereby increasing colonoscopic efficiency.”
Dr. Feuerstadt did not report funding sources. He disclosed consulting fees from Medivators, which makes endoscope reprocessing and related products.
SAN DIEGO – A disposable Endocuff cut colonoscopic withdrawal times by nearly a minute and slightly improved polyp detection, compared with standard colonoscopy, according to a randomized, prospective trial of 562 patients.
The Endocuff caused no known adverse effects except for superficial mucosal trauma, Dr. Paul Feuerstadt said at the annual Digestive Disease Week. The study, which is the first of its kind in the United States, suggests that the Endocuff can improve the efficiency of colonoscopies without undermining detection rates, he added.
The plastic, flexible Endocuff slides onto the tip of a standard colonoscope, and has phalanges that press on the colonic mucosa “to improve polyp and adenoma detection rates, at least in theory,” said Dr. Feuerstadt, who is at the Gastroenterology Center of Connecticut in Hamden, Conn.
Use of the device improved the polyp detection rate by 63% and adenoma detection by 86% in a previous study in Germany.
For the current study, Dr. Feuerstadt and his associates screened 1,067 consecutive patients at two endoscopy centers in Connecticut, and excluded those with colitis, inflammatory bowel disease, diarrhea, chronic splenomegaly, and a history of surgical resection or colonic stricture. The 562 remaining patients were randomized to either Endocuff-assisted or standard colonoscopies performed by eight endoscopists with historically high adenoma detection rates of nearly 44%.
Use of the Endocuff seemed to slightly improve polyp detection, though none of the primary comparisons reached statistical significance, despite sufficient study power, Dr. Feuerstadt said. The rate of polyp detection was 63% for Endocuff-assisted colonoscopy and 60% for standard colonoscopy (P = .41), while rates of adenoma detection were 42% and 45%, respectively. There was a nonsignificant trend toward higher detection of sessile serrated adenomas (11% versus 9%; P = .37).
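As a rough sanity check, the nonsignificance of the polyp-detection difference can be reproduced with a two-proportion z-test on the summary rates. This is an illustrative sketch, not the investigators' analysis: the per-arm counts were not reported, so equal arms of 281 patients are assumed here.

```python
import math

def two_proportion_p(p1, p2, n1, n2):
    """Two-sided p-value for a two-proportion z-test, from summary rates."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    return math.erfc(z / math.sqrt(2))  # two-sided normal tail probability

# Polyp detection: 63% vs. 60%; equal arms of 281 are an assumption,
# since the per-arm counts were not reported.
p = two_proportion_p(0.63, 0.60, 281, 281)
# ≈ 0.46 under these assumptions -- nonsignificant, in line with the
# reported P = .41
```

The exact p-value depends on the true arm sizes and test used, but any reasonable split of 562 patients leaves a 3-percentage-point difference far from significance.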
Notably, average withdrawal times were 9.9 minutes with the Endocuff (standard deviation, 5.5 minutes), versus 11.1 minutes without it (standard deviation, 5.9 minutes; P = .02). There were no perforations or other major adverse events, no instances of the Endocuff coming off the scope, and no difference in bleeding rates between the two groups.
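The withdrawal-time comparison can likewise be checked from the reported means and standard deviations, here with a normal approximation to the Welch t-test; again, equal arms of 281 patients are an assumption, not a reported figure.

```python
import math

def welch_z_p(m1, s1, n1, m2, s2, n2):
    """Approximate two-sided p-value comparing two means from summary
    statistics, using a normal approximation to the Welch t-test
    (reasonable at these sample sizes)."""
    se = math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)
    z = abs(m1 - m2) / se
    return math.erfc(z / math.sqrt(2))

# Withdrawal times: 9.9 min (SD 5.5) vs. 11.1 min (SD 5.9);
# equal arms of 281 are an assumption.
p = welch_z_p(9.9, 5.5, 281, 11.1, 5.9, 281)
# ≈ 0.01 under these assumptions, the same order as the reported P = .02
```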
However, 8% of Endocuff patients had mild mucosal trauma, compared with none of the control group, Dr. Feuerstadt reported.
The two groups resembled one another demographically, clinically, and in terms of their family history of colonic polyps. However, the Endocuff group had a higher frequency of first-degree relatives younger than age 50 years with colon cancer, Dr. Feuerstadt noted.
The endoscopists had an average historical adenoma detection rate (ADR) of 43.6%, “very similar to the 44.7% we saw in the study,” he added. “The device yields similar adenoma detection rates overall, with shorter withdrawal times, thereby increasing colonoscopic efficiency.”
Dr. Feuerstadt did not report funding sources. He disclosed consulting fees from Medivators, which makes endoscope reprocessing and related products.
AT DDW® 2016
Key clinical point: Attaching a disposable Endocuff to the tip of a colonoscope enabled endoscopists to cut about 1 minute off withdrawal times and slightly increase their polyp detection rates.
Major finding: The rate of polyp detection was 63% for Endocuff-assisted colonoscopy and 60% for standard colonoscopy (P = .41). Average withdrawal times were 9.9 minutes with the Endocuff and 11.1 minutes without it (P = .02).
Data source: A prospective, randomized, controlled trial of 562 patients from two endoscopy centers.
Disclosures: Dr. Feuerstadt disclosed consulting fees from Medivators, which makes endoscope reprocessing and related products.
Obesity is diverticula risk factor in women, not men
SAN DIEGO – Obesity is a risk factor for colonic diverticulosis among women but not men, while a low-fiber diet was not found to be a risk factor in a recent study reported at the annual Digestive Disease Week.
“The classic teaching in medical school is that a low-fiber diet increases constipation, which in turn increases the risk of diverticula,” explained lead author Dr. Anne Peery, assistant professor of medicine at the University of North Carolina, Chapel Hill. “This is in textbooks and on your boards. But there is no association between low-fiber dietary intake and diverticula.
“There is, however, evidence from other studies that a high-fiber diet and increased physical activity decrease the risk of developing complications from diverticula,” Dr. Peery added. She noted that the study was designed to look at risk factors for developing diverticula, not at complications.
“The provocative findings from our study are twofold: We found that the prevalence of diverticula is higher in men and lower in women younger than age 50, and that obesity is a risk factor for diverticula in women but not in men,” Dr. Peery said.
“The age-related gender differences we identified were quite surprising, and suggest that something is going on in women under the age of 50 that may be estrogen-related. This opens up an avenue of research,” she noted.
Colonic diverticula are common, and they are important because of complications such as hemorrhage, perforation, and inflammation. They also pose a substantial health burden, accounting for 2.5 million office visits and 4,500 deaths each year in the United States.
“Despite this, we know very little about risk factors for colonic diverticula,” Dr. Peery noted.
The prospective study recruited 624 patients between the ages of 30 years and 80 years undergoing a first screening colonoscopy between 2013 and 2015 at the University of North Carolina in Chapel Hill. Prior to undergoing the procedure, each participant was interviewed using validated instruments to assess diet and physical activity. Each participant had a detailed examination for colonic diverticula, with a research assistant present during the entire colonoscopy.
“The presence or absence of diverticula reported in previous studies was extracted from colonoscopy reports. Our study assessed risk factors prior to undergoing colonoscopy,” she emphasized. “This is one of the study strengths.”
Not surprisingly, the study showed that the prevalence of diverticula (or “tics”) increased with age. Among patients younger than age 50 years, prevalence was higher in men than in women; at older ages, the difference between the sexes disappeared.
In the study population, 124 men had diverticula and 150 did not; 136 women had diverticula and 214 did not. Women with diverticula were more likely to be older, white, and have a higher body mass index (BMI).
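For reference, the pooled prevalence by sex implied by these counts can be computed directly; note that this back-of-the-envelope figure pools all ages, whereas the sex difference described above was concentrated in patients younger than 50.

```python
# Counts as reported: 124 of 274 men and 136 of 350 women had diverticula.
men_with, men_total = 124, 124 + 150
women_with, women_total = 136, 136 + 214

men_prevalence = men_with / men_total        # ≈ 0.45
women_prevalence = women_with / women_total  # ≈ 0.39
```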
The investigators looked at several measures of obesity, including BMI, waist circumference, and waist-to-height ratio. Women with greater BMI were at increased risk for diverticula, a risk relationship that was not seen in men. The risk of developing six or more diverticula was more than twofold greater in obese women.
Men with a greater waist circumference (more than 102 cm) had no increased risk for diverticula, while women with a greater waist circumference (more than 88 cm) were at increased risk of any diverticula, as well as having six or more diverticula.
A similar pattern was observed for waist-to-height ratio, which some experts believe is related to obesity, according to Dr. Peery. No association was found in men. But for women, a high-risk waist-to-height ratio increased the risk of diverticula, and the risk of having six or more diverticula was almost twice as great in these women, compared with men.
The investigators then assessed the associations of dietary fiber intake and physical activity with diverticula. No association with diverticula was found in any quartile (lowest to highest) of either physical activity or dietary fiber intake.
In an interview, Dr. Peery speculated on why women have a lower prevalence of “tics,” compared with men younger than age 50. She said there are gender-related differences in the way fat is stored and metabolized.
“Obese women have more visceral adiposity than men, and they tend to eat more carbohydrates, while obese men have higher alcohol and meat intake. These differences will be studied in greater depth as they relate to diverticula and complications,” she noted.
AT DDW® 2016
Key clinical point: Obesity is associated with diverticula in women, not men, and a low-fiber diet is not a risk factor.
Major finding: Younger than age 50 years, men were more likely to have diverticula, and obese women were twice as likely as men to have six or more diverticula.
Data source: A prospective, cross-sectional study.
Disclosures: The National Institutes of Health sponsored the study.