Increased prevalence of chronic narcotic use in children with IBD
Chronic narcotic use is significantly more prevalent among pediatric patients with inflammatory bowel disease than among their peers, and clinicians should work to curb it before further adverse effects take hold, according to a new study published in the February issue of Clinical Gastroenterology and Hepatology.
“Although pain control is an important aspect of disease management in pediatric IBD [inflammatory bowel disease], little is known about chronic narcotic use in children,” wrote the investigators, led by Dr. Jessie P. Buckley of the University of North Carolina at Chapel Hill. They added that “although children with Crohn’s disease are at increased risk of anxiety and depression, the relationship between narcotic use and psychiatric conditions has not yet been assessed in the pediatric IBD population” (Clin. Gastroenterol. Hepatol. 2015 February [doi:10.1016/j.cgh.2014.07.057]).
In a retrospective, cross-sectional study, Dr. Buckley and her associates counted all 4,911,286 patients aged 18 years or younger with continuous health plan enrollment and pharmacy benefits in the MarketScan Commercial Claims and Encounters database between January 1, 2010, and December 31, 2011. Children were defined as having IBD if they had either three or more health care visits for Crohn’s disease or ulcerative colitis during the study’s time frame, or “at least one health care contact for Crohn’s disease or ulcerative colitis and at least one pharmacy claim for any of the following medications: mesalamine, olsalazine, balsalazide, sulfasalazine, 6-mercaptopurine, azathioprine, methotrexate, enteral budesonide, or biologics (infliximab, adalimumab, certolizumab, or natalizumab).”
From this population, the authors used diagnosis codes based on data from the U.S. National Drug Code system, the European Pharmaceutical Market Research Association, and the Pharmaceutical Business Intelligence and Research Group Anatomical Classification System drug classes, along with information on dispensation of IBD medications, to select 4,344 IBD patients. Each was then matched with five patients without IBD on age, sex, and U.S. geographic region (Northeast, North Central, South, or West), yielding 21,720 matched controls. Subjects were defined as chronic users if they “had at least three narcotic drug claims during the 2-year study period.” Within the IBD population, 2,737 subjects (63%) had Crohn’s disease and 1,607 (37%) had ulcerative colitis.
Investigators found that 241 (5.6%) of the 4,344 subjects with IBD were chronic narcotic users, significantly more than the 489 (2.3%) chronic narcotic users in the non-IBD population (prevalence odds ratio, 2.59; 95% confidence interval, 2.21–3.04). Furthermore, chronic narcotic use was more prevalent in IBD patients with psychological impairment than in those without: 15.7% (POR, 6.8; 95% CI, 4.3–10.6) versus 3.2% (POR, 2.3; 95% CI, 1.9–2.7), respectively. In children with IBD, chronic narcotic use was also associated with older age and with “increased health care utilization [and] fracture.”
“Children with IBD had more than twice the prevalence of chronic narcotic use as children without, and associations between IBD status and narcotic use indicate particularly high burden among those with concomitant anxiety or depression,” Dr. Buckley and her associates wrote, adding that “psychiatric diagnoses have also been associated with increased risk of narcotic use among adult patients with IBD. Greater attention to the complex relationships between pain and psychological impairment is warranted because children with IBD are at increased risk of anxiety and depression, compared with their peers, and other cofactors are not easily intervened on (e.g., age, region, fracture).”
The authors disclosed that financial support for this study was provided, at least in part, by GlaxoSmithKline (GSK). Dr. Buckley received funding through a research assistantship at GSK, coauthors Dr. Suzanne F. Cook and Dr. Jeffery K. Allen are employees of GSK, and coauthor Dr. Michael D. Kappelman is a consultant to GSK, among other companies.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: Chronic narcotic use is significantly more prevalent among pediatric patients with inflammatory bowel disease than among children without IBD.
Major finding: Prevalence of chronic narcotic use was 5.6% among children with IBD vs. 2.3% among matched children without IBD (prevalence odds ratio, 2.59).
Data source: Retrospective, cross-sectional study.
Disclosures: Support for this study was provided by GlaxoSmithKline; several of the study’s coauthors are employed by or otherwise affiliated with the company.
Transoral fundoplication can be effective against GERD symptoms
Transoral esophagogastric fundoplication can be an effective treatment for patients seeking to alleviate symptoms associated with gastroesophageal reflux disease, particularly in individuals with persistent regurgitation despite prior treatment with proton pump inhibitor therapy, according to the results of a new study published in the February issue of Gastroenterology (doi:10.1053/j.gastro.2014.10.009).
“Gastroesophageal reflux disease (GERD) remains one of the most common conditions for which Americans take daily medication, and PPI use has more than doubled in the last decade,” wrote lead authors Dr. John G. Hunter of Oregon Health & Science University in Portland, and Dr. Peter J. Kahrilas of Northwestern University in Chicago, and their associates. “Despite this, up to 40% of proton pump inhibitor (PPI)–dependent GERD patients have troublesome symptoms of GERD, despite PPI therapy.”
In the Randomized EsophyX vs Sham, Placebo-Controlled Transoral Fundoplication (RESPECT) trial, investigators screened 696 patients who were experiencing “troublesome regurgitation” despite daily PPI treatment, evaluating them with three validated GERD-specific symptom scales while either on or off PPIs. After treatment, patients remained blinded to their therapy and were reassessed at intervals of 2, 12, and 26 weeks. All patients underwent 48-hour esophageal pH monitoring and esophagogastroduodenoscopy at 6 months.
Regurgitation severity was assessed according to the Montreal definition, which the authors describe as “either mucosal damage or troublesome symptoms attributable to reflux,” and was used to measure the efficacy of the study treatments. Those with at least “troublesome” regurgitation while on PPIs “underwent barium swallow, esophagogastroduodenoscopy, 48-hour esophageal pH monitoring (off PPIs), and high-resolution esophageal manometry analyses.”
Eighty-seven subjects with GERD and hiatal hernias of at least 2 centimeters were randomly assigned to undergo transoral fundoplication (TF) followed by 6 months of placebo, while the 42 control subjects underwent a sham procedure followed by 6 months of once- or twice-daily omeprazole.
Results showed that 67% of patients who received TF experienced elimination of troublesome regurgitation, vs. 45% of those treated with PPIs (P = .023). Esophageal acid exposure also improved markedly after TF (from 9.3% to 6.3% on average; P < .001) but not after the sham procedure (8.6% preop vs. 8.9% postop on average). Fewer TF-treated patients were classified as having “no response” after 3 months than in the control group (11% vs. 36%; P = .004).
“Transoral fundoplication may fill the ‘therapeutic gap’ that exists between PPI and laparoscopic fundoplication,” wrote the authors. “Considering the virtual absence of dysphagia and bloating after TF, which may be problematic with LINX [LINX Reflux Management System], it would appear that TF is an option for patients with troublesome regurgitation, as well as for patients with troublesome GERD symptoms who wish not to take PPI over a protracted period of time.”
Several coauthors disclosed ties with the study sponsor EndoGastric Solutions of Redmond, Wash., as well as individual potential conflicts of interest.
FROM GASTROENTEROLOGY
Key clinical point: Transoral esophagogastric fundoplication (TF) is an effective treatment for gastroesophageal reflux disease symptoms, particularly in patients with persistent regurgitation despite proton pump inhibitor (PPI) therapy.
Major finding: Of patients who received TF, 67% experienced elimination of adverse regurgitation, compared with 45% of those treated with PPI (P = .023).
Data source: Randomized EsophyX vs Sham, Placebo-Controlled Transoral Fundoplication (RESPECT) trial.
Disclosures: Several coauthors disclosed ties with the study sponsor EndoGastric Solutions of Redmond, Wash., as well as individual potential conflicts of interest.
Sofosbuvir and ribavirin critical to preventing posttransplantation HCV recurrence
Treatment with sofosbuvir and ribavirin can significantly decrease the risk of posttransplantation HCV recurrence in patients with hepatitis C virus infection who undergo liver transplantation, according to two new studies published in the January issue of Gastroenterology (doi:10.1053/j.gastro.2014.09.023 and doi:10.1053/j.gastro.2014.10.001).
“In clinical trials, administration of sofosbuvir with ribavirin was associated with rapid decreases of HCV RNA to undetectable levels in patients with HCV genotype 1, 2, 3, 4, and 6 infections,” wrote lead author Dr. Michael P. Curry of the Beth Israel Deaconess Medical Center in Boston, and his coauthors on the first of these two studies. “In more than 3,000 patients treated to date, sofosbuvir has been shown to be safe, viral breakthrough during treatment has been rare (and associated with nonadherence), and few drug interactions have been observed.”
In a phase II, open-label study, Dr. Curry and his coinvestigators enrolled 61 patients with HCV of any genotype and cirrhosis with a Child-Turcotte-Pugh score no greater than 7, all of whom were wait-listed for liver transplantation. Subjects received up to 48 weeks of treatment with 400 mg of sofosbuvir plus ribavirin before transplantation. The primary outcome was an HCV-RNA level below 25 IU/mL at 12 weeks after transplantation among patients who had reached that level before the operation.
The investigators found that 43 subjects underwent transplantation with the desired HCV-RNA level; of these, 30 (70%) had a posttransplantation virologic response at 12 weeks (49% of all treated patients), 10 (23%) had recurrent infection, and 3 (7%) died. The most frequent side effects reported by subjects were fatigue (38%), headache (23%), and anemia (21%).
“This study provides proof of concept that virologic suppression without interferon significantly can reduce the rate of recurrent HCV after liver transplantation,” the authors wrote, adding that the results “compare favorably with those observed in other trials of pretransplantation antiviral therapy.”
In the second study, the authors found that 24 weeks of combination therapy with sofosbuvir and ribavirin is effective at preventing hepatitis C virus recurrence in patients who undergo liver transplantation.
“Recurrent HCV infection is the most common cause of mortality and graft loss following transplantation, and up to 30% of patients with recurrent infection develop cirrhosis within 5 years,” wrote the study’s authors, led by Dr. Michael Charlton of the Mayo Clinic in Rochester, Minn.
In a prospective, multicenter, open-label pilot study, investigators treated 40 patients with a 24-week regimen of 400 mg sofosbuvir plus ribavirin, the latter starting at 400 mg and adjusted per patient according to creatinine clearance and hemoglobin levels. Subjects were 78% male and 85% white; 83% had HCV genotype 1, 40% had cirrhosis, and 88% had previously been treated with interferon. The primary outcome was “sustained virologic response 12 weeks after treatment (SVR12).”
SVR12 was achieved by 28 (70%) of the 40 treated subjects. The most commonly reported adverse effects were fatigue (30%), diarrhea (28%), headache (25%), and anemia (20%). No patients exhibited detectable viral resistance during or after treatment, and although two patients discontinued treatment because of adverse events, investigators reported no deaths, graft losses, or episodes of rejection.
“In contrast,” Dr. Charlton and his coauthors noted, “interferon-based treatments have been associated with posttreatment immunological dysfunction (particularly plasma cell hepatitis) and even hepatic decompensation in LT [liver transplant] recipients.”
The authors of the first study disclosed that Dr. Curry has received grants from and been affiliated with Gilead, which was a sponsor of the study. The authors of the second study reported no relevant financial disclosures.
Sofosbuvir and ribavirin treatments should be administered to patients with hepatitis C virus who undergo liver transplantations in order to significantly decrease the risks of posttransplant HCV recurrence, according to two new studies published in the January issue of Gastroenterology (10.1053/j.gastro.2014.09.023 and 10.1053/j.gastro.2014.10.001).
“In clinical trials, administration of sofosbuvir with ribavirin was associated with rapid decreases of HCV RNA to undetectable levels in patients with HCV genotype 1, 2, 3, 4, and 6 infections,” wrote lead author Dr. Michael P. Curry of the Beth Israel Deaconess Medical Center in Boston, and his coauthors on the first of these two studies. “In more than 3,000 patients treated to date, sofosbuvir has been shown to be safe, viral breakthrough during treatment has been rare (and associated with nonadherence), and few drug interactions have been observed.”
Sofosbuvir and ribavirin treatments should be administered to patients with hepatitis C virus who undergo liver transplantations in order to significantly decrease the risks of posttransplant HCV recurrence, according to two new studies published in the January issue of Gastroenterology (doi: 10.1053/j.gastro.2014.09.023 and doi: 10.1053/j.gastro.2014.10.001).
“In clinical trials, administration of sofosbuvir with ribavirin was associated with rapid decreases of HCV RNA to undetectable levels in patients with HCV genotype 1, 2, 3, 4, and 6 infections,” wrote lead author Dr. Michael P. Curry of the Beth Israel Deaconess Medical Center in Boston, and his coauthors on the first of these two studies. “In more than 3,000 patients treated to date, sofosbuvir has been shown to be safe, viral breakthrough during treatment has been rare (and associated with nonadherence), and few drug interactions have been observed.”
In a phase II, open-label study, Dr. Curry and his coinvestigators enrolled 61 patients with HCV of any genotype, and cirrhosis with a Child-Turcotte-Pugh score no greater than 7, all of whom were wait-listed to receive liver transplantations. Subjects received up to 48 weeks of treatment with 400 mg of sofosbuvir plus ribavirin prior to liver transplantation, and 43 patients went on to receive transplantation. The primary outcome was an HCV-RNA level of less than 25 IU/mL at 12 weeks after transplantation among patients who had reached that level prior to the operation.
The investigators found that 43 subjects reached the target HCV-RNA level before transplantation. Of those 43, 30 (70%) had a posttransplantation virologic response at 12 weeks, 10 (23%) had recurrent infection, and 3 (7%) died; measured against all 61 enrolled subjects, the response rate was 49%. The most frequent side effects reported by subjects were fatigue (38%), headache (23%), and anemia (21%).
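The two response figures reconcile as a per-protocol rate versus an intention-to-treat rate; a short arithmetic check, assuming the 49% figure is computed over all 61 enrolled subjects (the study text does not state that denominator explicitly):

```python
def pct(numerator: int, denominator: int) -> int:
    """Percentage rounded to the nearest whole number."""
    return round(100 * numerator / denominator)

# 30 responders among the 43 subjects who reached the target HCV-RNA
# level before transplantation (per protocol) ...
per_protocol = pct(30, 43)        # -> 70
# ... versus 30 responders among all 61 enrolled subjects (an assumed
# intention-to-treat denominator consistent with the reported 49%).
intention_to_treat = pct(30, 61)  # -> 49

print(per_protocol, intention_to_treat)
```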
“This study provides proof of concept that virologic suppression without interferon can significantly reduce the rate of recurrent HCV after liver transplantation,” the authors wrote, adding that the results “compare favorably with those observed in other trials of pretransplantation antiviral therapy.”
In the second study, the authors ascertained that combination therapy consisting of sofosbuvir and ribavirin for 24 weeks is effective at preventing hepatitis C virus recurrence in patients who undergo liver transplantations.
“Recurrent HCV infection is the most common cause of mortality and graft loss following transplantation, and up to 30% of patients with recurrent infection develop cirrhosis within 5 years,” wrote the study’s authors, led by Dr. Michael Charlton of the Mayo Clinic in Rochester, Minn.
In a prospective, multicenter, open-label pilot study, investigators enrolled 40 patients and treated them with a 24-week regimen of 400 mg sofosbuvir plus ribavirin started at 400 mg and subsequently adjusted per patient on the basis of creatinine clearance and hemoglobin levels. Subjects were 78% male and 85% white, with 83% having HCV genotype 1, 40% having cirrhosis, and 88% having been previously treated with interferon. The primary outcome was “sustained virologic response 12 weeks after treatment (SVR12).”
Data showed that SVR12 was achieved by 28 of the 40 subjects who received treatment, or 70%. The most commonly reported adverse effects were fatigue (30%), diarrhea (28%), headache (25%), and anemia (20%). No patients exhibited detectable viral resistance during or after treatment, and although two patients discontinued treatment because of adverse events, investigators reported no deaths, graft losses, or episodes of rejection.
“In contrast,” Dr. Charlton and his coauthors noted, “interferon-based treatments have been associated with posttreatment immunological dysfunction (particularly plasma cell hepatitis) and even hepatic decompensation in LT [liver transplant] recipients.”
The authors of the first study disclosed that Dr. Curry has received grants from and been affiliated with Gilead, which was a sponsor of the study. The authors of the second study reported no relevant financial disclosures.
FROM GASTROENTEROLOGY
Visible light spectroscopy should be used in diagnosing chronic gastrointestinal ischemia
Using visible light spectroscopy in the diagnosis of patients with suspected chronic gastrointestinal ischemia can lead to more accurate diagnoses, more effective treatment regimens, and, ultimately, more durable treatment responses, according to a new study published in the January issue of Clinical Gastroenterology and Hepatology (doi: 10.1016/j.cgh.2014.07.012).
“Medical history and physical examination were poor predictors for the presence of CGI [chronic gastrointestinal ischemia] [but] addition of radiologic evaluation [and] functional testing by means of tonometry substantially improved the accuracy of diagnosis,” said study leader Dr. Aria Sana of Utrecht University in the Netherlands.
The authors added that “VLS [visible light spectroscopy] has recently been introduced as a new minimally invasive technique to detect mucosal hypoxia by means of measurement of mucosal capillary hemoglobin oxygen saturation during endoscopy in patients clinically suspected of CGI.”
In a prospective study, Dr. Sana and her associates gathered data on 212 patients referred to their medical center between November 2008 and January 2011 for suspected CGI. Subjects underwent visualization of gastrointestinal arteries and assessments of mucosal perfusion via VLS; those found to have occlusive CGI were followed up after a median of 13 months to assess their response to treatment.
Of the 212 subjects initially screened, 107 (50%) were found to have occlusive CGI. Of that population, 96 (90%) were offered treatment, of whom 89 (93%) were available to provide follow-up data after a median of 13 months.
Investigators found that 62 subjects (70% of the 89 who reported after the follow-up period) had sustained responses to the treatment they were prescribed as a result of VLS and visualization-based diagnoses. Furthermore, patients who displayed weight loss, abdominal bruit, and low corpus mucosal saturation were found most likely to respond to treatment, particularly the latter: a corpus saturation level of less than 56% was “one of the strongest predictors of a positive treatment response,” investigators found.
“The presence of [at least] two predictors or the absence of any predictor was of discriminative value with [greater than] 85% vs. [less than] 50% response rate, respectively, suggesting patients with a predicted response rate of < 50% should primarily be considered for conservative management,” Dr. Sana and her coinvestigators noted.
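The quoted decision rule can be sketched as a simple predictor count; a minimal illustration, noting that the study quote specifies only the at-least-two and zero-predictor cases, so the single-predictor branch below is an assumption:

```python
def predicted_response(weight_loss: bool, abdominal_bruit: bool,
                       corpus_saturation_pct: float) -> str:
    """Classify predicted treatment response from the three predictors
    reported in the study: weight loss, an abdominal bruit, and a corpus
    mucosal saturation level below 56%."""
    n_predictors = sum([weight_loss, abdominal_bruit,
                        corpus_saturation_pct < 56])
    if n_predictors >= 2:
        return "predicted response rate > 85%"
    if n_predictors == 0:
        return "predicted response rate < 50%; consider conservative management"
    # Exactly one predictor: not addressed in the quoted passage.
    return "intermediate (one predictor present)"

print(predicted_response(True, True, 60.0))    # two predictors present
print(predicted_response(False, False, 58.0))  # no predictors present
```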
The authors disclosed no conflicts of any kind.
In this study from the Netherlands, the authors used visible light spectroscopy (VLS) to diagnose chronic gastrointestinal ischemia (CGI) and to predict its response to surgical or endoscopic treatment; a challenging task indeed. Diagnosis of CGI is difficult because the classic symptom complex of meal-precipitated abdominal pain leading to weight loss is nonspecific; asymptomatic splanchnic vascular obstruction is not uncommon in the general population, with autopsy series showing significant stenosis of the celiac, superior mesenteric, and inferior mesenteric arteries of 50%, 30%, and 30%, respectively; and radiologic imaging tests conventionally used to diagnose CGI evaluate only vascular anatomy and not physiologic parameters of ischemia, i.e., intracellular acidosis (which can be evaluated by balloon tonometry) or mucosal hemoglobin oxygen saturation, which is determined by VLS.
The authors evaluated 212 patients referred for suspected CGI and in all performed radiologic imaging of the splanchnic vessels and assessment of mucosal perfusion by VLS. Obstructive CGI (OCGI) was diagnosed by a multidisciplinary team and required clinical agreement, significant stenosis of at least one splanchnic artery, and mucosal ischemia as determined by VLS. OCGI was classified as single- or multivessel, and patients were offered surgical or endovascular revascularization. Response to treatment was then evaluated in patients with OCGI. A total of 107 patients were diagnosed with OCGI, and data on response to treatment were available for 89, with a median follow-up of 13 months. Sustained symptomatic response was seen in 62 (70%) and was most strongly predicted by the presence of weight loss, an abdominal bruit, and a gastric corpus mucosal saturation level of < 56%. This is an important study because it stresses the value of a team approach to diagnosis and the use of a combination of techniques to evaluate vascular obstruction and its functional significance.
There were several shortcomings of the study, in my opinion, although these do not lessen the importance of the study’s message. Most important, the study was not blinded. Spontaneous resolution of symptoms was noted in 19 patients without OCGI and in an additional 9 patients treated with a variety of medical and surgical means. Celiac artery compression release relieved symptoms in 17 patients, although the ischemic nature of this entity is arguable. Repeat VLS was not performed in all patients. Finally, clinical follow-up was relatively short and long-term follow-up was mainly by questionnaire, leading to another source of bias. To perform fiber-optic catheter-based VLS oximetry during esophagogastroduodenoscopy in a consistent fashion is technically challenging but probably worthwhile to learn. The authors continue to shine light on this relatively dark and poorly illuminated subject matter; let it shine, let it shine, let it shine.
Dr. Lawrence J. Brandt, MACG, AGAF, FASGE, is a professor of medicine and surgery at the Albert Einstein College of Medicine, N.Y., and emeritus chief, division of gastroenterology, Montefiore Medical Center, Bronx, N.Y.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: Diagnosis of chronic gastrointestinal ischemia (CGI) via visible light spectroscopy (VLS) can lead to more effective treatment methods that yield longer-term results.
Major finding: Of patients diagnosed with CGI via VLS, 70% reported sustained responses to treatment after 13 months.
Data source: A prospective study of 212 patients with suspected CGI referred to a single Dutch medical center between November 2008 and January 2011.
Disclosures: The authors reported no conflicts.
Identification of subtypes and tumor mutations in CRC and stage III colon cancer strongly predicts survival
Identifying subtypes and tumor mutations in patients with colorectal and stage III colon cancer can significantly improve prognostication, according to two new studies published in the January issue of Gastroenterology (doi: 10.1053/j.gastro.2014.09.038 and doi: 10.1053/j.gastro.2014.09.041).
In the first study, researchers found that etiologically defined subtypes of colorectal cancer are characterized by marked differences in survival rates and confirmed the clinical importance of studying the molecular heterogeneity of the disease.
“Increasing evidence indicates that colorectal cancer (CRC) is a biologically heterogeneous disease that can develop via a number of distinct pathways involving different combinations of genetic and epigenetic changes,” wrote Amanda Phipps, Ph.D., of the University of Washington in Seattle, and her coinvestigators, adding that “the biologic distinctions between CRC subtypes resulting from different etiologic pathways may plausibly translate to differences in survival.”
Between 1998 and 2007, 2,706 participants were enrolled in this study through the population-based Seattle Colon Cancer Family Registry and followed for survival through 2012. Of those, 2,050 provided tumor samples, each of which was classified into one of five subtypes based on tumor markers: type 1 (microsatellite instability [MSI] high, CpG island methylator phenotype [CIMP] positive, positive for BRAF mutation, negative for KRAS mutation); type 2 (microsatellite stable [MSS] or MSI-low, CIMP-positive, positive for BRAF mutation, negative for KRAS mutation); type 3 (MSS or MSI-low, non-CIMP, negative for BRAF mutation, positive for KRAS mutation); type 4 (MSS or MSI-low, non-CIMP, negative for mutations in BRAF and KRAS); and type 5 (MSI-high, non-CIMP, negative for mutations in BRAF and KRAS).
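The five marker combinations amount to a simple lookup; a minimal sketch, with an illustrative boolean encoding (MSS and MSI-low are both represented as `msi_high=False`, and marker combinations outside the five defined subtypes return `None`):

```python
def crc_subtype(msi_high: bool, cimp_positive: bool,
                braf_mutant: bool, kras_mutant: bool):
    """Map the four tumor markers to subtypes 1-5 as defined in the study.
    Returns None for combinations outside the five defined subtypes."""
    if msi_high and cimp_positive and braf_mutant and not kras_mutant:
        return 1
    if not msi_high and cimp_positive and braf_mutant and not kras_mutant:
        return 2
    if not msi_high and not cimp_positive and not braf_mutant and kras_mutant:
        return 3
    if not msi_high and not cimp_positive and not braf_mutant and not kras_mutant:
        return 4
    if msi_high and not cimp_positive and not braf_mutant and not kras_mutant:
        return 5
    return None

print(crc_subtype(True, True, True, False))     # type 1
print(crc_subtype(False, False, False, False))  # type 4
```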
To analyze data, Cox regression models were used to estimate hazard ratios, 95% confidence intervals, and associations for each subtype with specific diseases and overall mortality, all of which were adjusted for age, sex, body mass, diagnosis year, and smoking history.
Results indicated that type 4 tumors were the most common, but subjects with type 2 tumors had the highest disease-specific mortality (hazard ratio, 2.20), and subjects with type 3 tumors also had elevated disease-specific mortality (HR, 1.32). Type 5 tumors were associated with the lowest disease-specific mortality (HR, 0.30). Associations with overall mortality were similar to those with disease-specific mortality.
“These findings contribute to a small but growing literature supporting the significance of CRC-subtype classifications defined by combinations of these tumor markers,” the authors noted.
The second study’s results indicate that patients with stage III colon cancer whose tumors were MMR proficient and carried mutations in BRAF (V600E) or KRAS had shorter survival times than patients whose tumors lacked these mutations.
“Defining [CRC] tumor subtypes based upon pathway-driven alterations has the potential to improve prognostication and guide targeted therapy,” wrote Dr. Frank Sinicrope of the Mayo Clinic in Rochester, Minn., and his associates.
Investigators collected 2,720 tumor samples from patients with stage III colon cancer, detecting mutations in BRAF (V600E) and KRAS with a polymerase chain reaction–based assay and classifying tumors as deficient or proficient in DNA mismatch repair (MMR) on the basis of detection of the MLH1, MSH2, and MSH6 proteins and methylation of the MLH1 promoter. A separate sample of tumors, taken from 783 stage III cancer patients, was used to validate the findings, and Cox proportional hazards models were employed to evaluate associations between tumor subtypes and 5-year disease-free survival.
Tumors were divided into five subtypes, three of which were MMR proficient – those with BRAFV600E (6.9% of samples), mutations in KRAS (35%), or tumors lacking either BRAFV600E or mutations in KRAS (49%) – and two of which were MMR deficient: the sporadic type (6.8%) with BRAFV600E or hypermethylation of MLH1, and the familial type (2.6%), which lacked BRAFV600E or hypermethylation of MLH1.
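This five-way classification can be sketched as a short decision function; a minimal illustration, assuming MMR status, BRAF V600E, KRAS mutation, and MLH1 hypermethylation are available as booleans (the labels below are descriptive, not the study's nomenclature):

```python
def stage3_subtype(mmr_proficient: bool, braf_v600e: bool,
                   kras_mutant: bool, mlh1_hypermethylated: bool) -> str:
    """Assign one of the five subtypes described for stage III colon cancer."""
    if mmr_proficient:
        if braf_v600e:
            return "MMR-proficient, BRAF V600E"       # 6.9% of samples
        if kras_mutant:
            return "MMR-proficient, KRAS mutant"      # 35%
        return "MMR-proficient, BRAF/KRAS wild type"  # 49%
    # MMR-deficient tumors split on BRAF V600E or MLH1 hypermethylation.
    if braf_v600e or mlh1_hypermethylated:
        return "MMR-deficient, sporadic"              # 6.8%
    return "MMR-deficient, familial"                  # 2.6%

print(stage3_subtype(True, False, True, False))   # KRAS-mutant, MMR proficient
print(stage3_subtype(False, True, False, False))  # sporadic MMR deficient
```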
The data accumulated by Dr. Sinicrope and his coinvestigators showed that more MMR-proficient tumors with BRAFV600E than MMR-proficient tumors without BRAFV600E or mutations in KRAS were proximal, high grade, N2, and prevalent in female subjects: 76% vs. 33%, 44% vs. 19%, 59% vs. 41%, and 59% vs. 42%, respectively (overall P < .0001).
Furthermore, a “significantly lower proportion” of patients with MMR-proficient tumors carrying BRAFV600E (HR, 1.43; adjusted P = .0065) or mutant KRAS (HR, 1.48; adjusted P < .0001) survived disease free for 5 years, compared with those whose MMR-proficient tumors lacked mutations in either gene.
“We found that a biomarker-based classifier can identify prognostically distinct subtypes within stage III colon cancer patients that was externally validated,” the authors wrote, adding that “taken together, our biomarker classifier provides important prognostic information in stage III colon cancers with implications for patient management.”
The investigators for both studies reported no relevant financial disclosures.
It is now recognized that colon cancer is quite heterogeneous on a genetic level, and the clinical features associated with each of these genetic subtypes are equally heterogeneous. The two current studies addressed the question of whether long-term prognosis differs among these genetic subtypes. Colon tumors were first divided into five distinct categories, based upon a panel of multiple molecular markers. One study analyzed more than 2,700 colon cancers of all stages, whereas the other examined more than 2,700 stage III tumors from a North Central Cancer Treatment Group adjuvant chemotherapy trial.
Similar patterns emerged. Tumors with the least favorable prognosis were the so-called “serrated” tumors that are DNA mismatch-repair (MMR) proficient and are positive for a BRAF mutation. Tumors with deficient MMR (MSI-H), whether sporadic or associated with Lynch syndrome, consistently exhibited the most favorable prognosis and highest rates of long-term survival.
These studies provide strong evidence that links survival with specific tumor genotypes, regardless of stage or treatment, and establish the significance of molecular genotyping for prognostic purposes. There are other important reasons to perform tumor genotyping, including the identification of unrecognized Lynch syndrome. However, the therapeutic implications of tumor genotyping remain less clear, as meaningful targeted therapies for each of the specific subgroups are still lacking. In particular, effective targeting of the BRAF oncogene in serrated tumors remains an important priority. More refined molecular classifications are likely to emerge in the future, and the opportunities to offer more precise and personalized approaches to management should increase in parallel.
Dr. Daniel C. Chung is associate professor, department of medicine, Harvard Medical School, Boston, and director, Hi-Risk Gastrointestinal Cancer Clinic, Massachusetts General Hospital, also in Boston. He had no conflicts of interest.
It is now recognized that colon cancer is quite heterogeneous on a genetic level, and the clinical features associated with each of these genetic subtypes are equally heterogeneous. The two current studies addressed the question of whether long-term prognosis differs among these genetic subtypes. Colon tumors were first divided into five distinct categories, based upon a panel of multiple molecular markers. One study analyzed more than 2,700 colon cancers of all stages, whereas the other examined more than 2,700 stage III tumors from a North Central Cancer Treatment Group adjuvant chemotherapy trial.
Similar patterns emerged. Tumors with the least favorable prognosis were the so-called “serrated” tumors that are DNA mismatch-repair (MMR) proficient and are positive for a BRAF mutation. Tumors with deficient MMR (MSI-H), whether sporadic or associated with Lynch syndrome, consistently exhibited the most favorable prognosis and highest rates of long-term survival.
These studies provide strong evidence that links survival with specific tumor genotypes, regardless of stage or treatment, and establish the significance of molecular genotyping for prognostic purposes. There are other important reasons to perform tumor genotyping, including the identification of unrecognized Lynch syndrome. However, the therapeutic implications of tumor genotyping remain less clear, as meaningful targeted therapies for each of the specific subgroups are still lacking. In particular, effective targeting of the BRAF oncogene in serrated tumors remains an important priority. More refined molecular classifications are likely to emerge in the future, and the opportunities to offer more precise and personalized approaches to management should increase in parallel.
Dr. Daniel C. Chung is associate professor, department of medicine, Harvard Medical School, Boston, and director, Hi-Risk Gastrointestinal Cancer Clinic, Massachusetts General Hospital, also in Boston. He had no conflicts of interest.
Molecularly defined subtypes and tumor mutations in patients with colorectal cancer, including stage III colon cancer, are associated with significant differences in survival, according to two new studies published in the January issue of Gastroenterology (doi:10.1053/j.gastro.2014.09.038 and doi:10.1053/j.gastro.2014.09.041).
In the first study, researchers found that etiologically defined subtypes of colorectal cancer are characterized by marked differences in survival rates and confirmed the clinical importance of studying the molecular heterogeneity of the disease.
“Increasing evidence indicates that colorectal cancer (CRC) is a biologically heterogeneous disease that can develop via a number of distinct pathways involving different combinations of genetic and epigenetic changes,” wrote Amanda Phipps, Ph.D., of the University of Washington in Seattle, and her coinvestigators, adding that “the biologic distinctions between CRC subtypes resulting from different etiologic pathways may plausibly translate to differences in survival.”
Between 1998 and 2007, 2,706 participants were enrolled in this study through the population-based Seattle Colon Cancer Family Registry and followed for survival through 2012. Tumor samples were collected from 2,050 of them, and each sample was classified into one of five subtypes based on tumor markers: type 1 (microsatellite instability [MSI] high, CpG island methylator phenotype [CIMP] positive, positive for BRAF mutation, negative for KRAS mutation); type 2 (microsatellite stable [MSS] or MSI-low, CIMP-positive, positive for BRAF mutation, negative for KRAS mutation); type 3 (MSS or MSI-low, non-CIMP, negative for BRAF mutation, positive for KRAS mutation); type 4 (MSS or MSI-low, non-CIMP, negative for mutations in BRAF and KRAS); and type 5 (MSI-high, non-CIMP, negative for mutations in BRAF and KRAS).
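The five subtypes amount to a simple decision rule over four tumor markers. A minimal sketch of that rule in Python (the function and its argument names are illustrative assumptions, not the study's actual classification code):

```python
def classify_subtype(msi: str, cimp: bool, braf: bool, kras: bool):
    """Map a tumor-marker combination to subtype 1-5 as described by
    Phipps et al. Returns None for combinations outside the scheme.

    msi: "high", "low", or "stable"; cimp/braf/kras: marker positivity.
    """
    mss_or_low = msi in ("stable", "low")
    if msi == "high" and cimp and braf and not kras:
        return 1  # MSI-high, CIMP-positive, BRAF-mutant, KRAS-wild-type
    if mss_or_low and cimp and braf and not kras:
        return 2  # MSS/MSI-low, CIMP-positive, BRAF-mutant, KRAS-wild-type
    if mss_or_low and not cimp and not braf and kras:
        return 3  # MSS/MSI-low, non-CIMP, BRAF-wild-type, KRAS-mutant
    if mss_or_low and not cimp and not braf and not kras:
        return 4  # MSS/MSI-low, non-CIMP, BRAF- and KRAS-wild-type
    if msi == "high" and not cimp and not braf and not kras:
        return 5  # MSI-high, non-CIMP, BRAF- and KRAS-wild-type
    return None
```

Note that the scheme is not exhaustive: combinations such as MSI-high with a KRAS mutation fall outside the five types, which is why the sketch returns None for them.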
To analyze the data, the researchers used Cox regression models to estimate hazard ratios and 95% confidence intervals for the association of each subtype with disease-specific and overall mortality, adjusted for age, sex, body mass index, diagnosis year, and smoking history.
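The hazard ratios reported below are exponentiated Cox regression coefficients, and a 95% confidence interval follows from the coefficient's standard error. A brief illustration of that arithmetic (the helper function and the standard error used here are hypothetical, not figures from the study):

```python
import math

def hazard_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Convert a Cox regression coefficient (beta) and its standard
    error into a hazard ratio with a confidence interval:
    HR = exp(beta), CI = exp(beta +/- z * se)."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Illustrative only: a coefficient of ln(2.20) corresponds to the HR of
# 2.20 reported for type 2 tumors; the standard error is made up.
hr, lo, hi = hazard_ratio_ci(math.log(2.20), 0.15)
```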
Results indicated that type 4 tumors were the most common, but subjects with type 2 tumors had the highest disease-specific mortality (hazard ratio = 2.20), and subjects with type 3 tumors also had elevated disease-specific mortality (HR = 1.32). Type 5 tumors were associated with the lowest disease-specific mortality (HR = 0.30). Associations with overall mortality were similar to those with disease-specific mortality.
“These findings contribute to a small but growing literature supporting the significance of CRC-subtype classifications defined by combinations of these tumor markers,” the authors noted.
The second study’s results indicate that patients with stage III colon cancer whose MMR-proficient tumors carried mutations in BRAF (V600E) or KRAS had shorter survival times than patients whose tumors lacked these mutations.
“Defining [CRC] tumor subtypes based upon pathway-driven alterations has the potential to improve prognostication and guide targeted therapy,” wrote Dr. Frank Sinicrope of the Mayo Clinic in Rochester, Minn., and his associates.
Investigators collected 2,720 tumor samples from stage III colon cancer patients, detected mutations in BRAF (V600E) and KRAS using a polymerase chain reaction–based assay, and classified tumors as deficient or proficient in DNA mismatch repair (MMR) based on detection of the MLH1, MSH2, and MSH6 proteins and methylation of the MLH1 promoter. They validated their findings in a separate sample of tumors from 783 stage III cancer patients, and used Cox proportional hazards models to evaluate associations between tumor subtypes and 5-year disease-free survival rates.
Tumors were divided into five subtypes, three of which were MMR proficient – those with BRAFV600E (6.9% of samples), mutations in KRAS (35%), or tumors lacking either BRAFV600E or mutations in KRAS (49%) – and two of which were MMR deficient: the sporadic type (6.8%) with BRAFV600E or hypermethylation of MLH1, and the familial type (2.6%), which lacked BRAFV600E or hypermethylation of MLH1.
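This second classification is likewise a small decision rule, here over MMR status, BRAFV600E, KRAS, and MLH1 promoter methylation. A hedged sketch (the function and its return labels are illustrative assumptions, not the authors' classifier):

```python
def classify_stage3_subtype(mmr_proficient: bool, braf_v600e: bool,
                            kras_mutant: bool, mlh1_hypermethylated: bool) -> str:
    """Assign one of the five stage III subtypes described by
    Sinicrope et al., as summarized in the text above."""
    if mmr_proficient:
        if braf_v600e:
            return "MMR-proficient, BRAF V600E"
        if kras_mutant:
            return "MMR-proficient, KRAS mutant"
        return "MMR-proficient, BRAF/KRAS wild type"
    # MMR-deficient tumors: sporadic if BRAF V600E or MLH1 promoter
    # hypermethylation is present, familial (Lynch-like) otherwise.
    if braf_v600e or mlh1_hypermethylated:
        return "MMR-deficient, sporadic"
    return "MMR-deficient, familial"
```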
The data accumulated by Dr. Sinicrope and his coinvestigators showed that MMR-proficient tumors with BRAFV600E were more likely than MMR-proficient tumors lacking both BRAFV600E and KRAS mutations to be proximal (76% vs. 33%), high grade (44% vs. 19%), and N2 (59% vs. 41%), and to occur in female patients (59% vs. 42%) (overall P < .0001).
Furthermore, a “significantly lower proportion” of patients having MMR-proficient tumors with BRAFV600E (HR 1.43; adjusted P = .0065) or mutant KRAS (HR 1.48; adjusted P < .0001) survived disease free for 5 years, compared with those whose MMR-proficient tumors lacked mutations in either gene.
“We found that a biomarker-based classifier can identify prognostically distinct subtypes within stage III colon cancer patients that was externally validated,” the authors wrote, adding that “taken together, our biomarker classifier provides important prognostic information in stage III colon cancers with implications for patient management.”
The investigators for both studies reported no relevant financial disclosures.
FROM GASTROENTEROLOGY
Cathelicidins might help prevent, treat colonic fibrosis
Peptides known as cathelicidins directly inhibited collagen synthesis in human colonic fibroblasts and in mice with colitis, authors of a controlled, prospective study reported. The findings appeared Nov. 11 in Cellular and Molecular Gastroenterology and Hepatology.
“Our results strongly suggest that cathelicidin administration may be a novel approach to prevent or treat inflammatory bowel disease and IBD-related colonic fibrosis,” said Dr. Jun Hwan Yoo at the University of California, Los Angeles, and his associates.
Cathelicidins are endogenous antimicrobial peptides that exhibit “potent” anti-inflammatory effects against acute colitis, and inhibit colonic fibrosis in mice with chronic or infectious colitis, the investigators said. In past studies, cathelicidin-deficient mice were more susceptible to infections, had poorer wound healing, and developed worse colitis, compared with mice that were not cathelicidin deficient, they added. Cathelicidins also inhibit collagen synthesis in human dermal fibroblasts, they said (Cellular and Molecular Gastroenterology and Hepatology 2014 Nov. 11 [doi:10.1016/j.jcmgh.2014.08.001]).
To further explore the role of cathelicidins in intestinal fibrosis, Dr. Yoo and associates created two murine models of intestinal inflammation by infecting mice with Salmonella or by administering trinitrobenzene sulfonic acid enemas. Then they administered either intracolonic mCRAMP peptide at a dose of 5 mg/kg every 3 days, or intravenous injections of a lentivirus that overexpressed the cathelicidin gene. The researchers also exposed human intestinal fibroblasts and human colonic CCD-18Co fibroblasts to transforming growth factor beta1 (TGF-beta1) or to insulinlike growth factor 1, which induced collagen protein and mRNA expression that mimicked intestinal fibrosis. Then they exposed these cells to 3-5 micromolar concentrations of the human cathelicidin LL-37.
The groups of mice with colitis had significantly higher colonic expression of collagen mRNA and significantly more colon tissue damage than did the normal controls, said the researchers. Mice with colitis that received mCRAMP or the lentivirus overexpressing the cathelicidin gene had significantly lower collagen mRNA levels and less cecal and colonic collagen deposition, compared with mice that received only the vehicle control, they added (all P values less than .05). Intracolonic mCRAMP also restored body weight (P = .0178) in mice with colitis, compared with untreated controls, they added. Furthermore, LL-37 inhibited collagen synthesis in human intestinal and colonic fibroblasts (P = .0001), they said.
The research was supported by the UCLA CURE Center, the Crohn’s and Colitis Foundation of America, the National Institutes of Health, the Blinder Research Foundation for Crohn’s Disease, the Eli and Edythe Broad Chair, and the U.S. Public Health Service. The authors declared no conflicts of interest.
Fibrosis is a major complication of Crohn’s disease that can lead to strictures and intestinal obstruction. While biologic therapies have revolutionized medical treatment of Crohn’s disease and may reduce the incidence of recurrent stricturing disease, as many as 20% of Crohn’s disease patients treated with these agents still develop strictures.
One attractive therapeutic target could be TGF-beta, which induces fibroblasts to synthesize collagen and is upregulated in Crohn’s disease strictures. Unfortunately, anti-TGF-beta1 antibody therapy failed to work in other fibrotic diseases. In addition, toxicity might be expected in Crohn’s disease, as TGF-beta has pleiotropic functions in the gut, some of which are critical to homeostasis.
Recent work in extraintestinal organs has elucidated the involvement of cathelicidins, antimicrobial cationic peptides, in fibrosis. For example, LL-37, the cleaved form of human cathelicidin, can reduce TGF-beta–induced collagen synthesis by human keloid fibroblasts. However, the ability of cathelicidins to limit intestinal fibrosis has not been explored.
The study by Yoo et al. is important because it demonstrates that cathelicidins may be a powerful therapy for intestinal fibrosis. Importantly, cathelicidin therapy does not appear to affect TGF-beta signaling, as LL-37 therapy did not affect TGF-beta1 expression in vivo, but did inhibit TGF-beta–induced collagen production by primary Crohn’s disease intestinal fibroblasts in vitro.
It is also important to recognize that cathelicidins are already being developed as therapy to limit fibrosis in extraintestinal diseases. Thus, much in the same way Crohn’s disease patients benefited from the development of anti-TNF biologics for rheumatoid arthritis, it may be possible to take advantage of ongoing studies to reduce the cost and time involved in delivering a new therapeutic agent to patients.
Dr. Stefania Vetrano is a research associate at the IBD Center, Humanitas Clinical and Research Center, Rozzano, Milan. She has no conflicts of interest.
Key clinical point: Cathelicidins might help prevent or reverse intestinal fibrosis in patients with inflammatory bowel disease.
Major finding: Cathelicidins inhibited colonic fibrosis in mice with colitis and in human intestinal fibroblasts (P = .0001).
Data source: Controlled prospective study of the effects of cathelicidins in laboratory mice with colitis and in human intestinal fibroblasts.
Disclosures: The research was supported by the UCLA CURE Center, the Crohn’s and Colitis Foundation of America, the National Institutes of Health, the Blinder Research Foundation for Crohn’s Disease, the Eli and Edythe Broad Chair, and the U.S. Public Health Service. The authors declared no conflicts of interest.
Aramchol shows promise for NAFLD; resveratrol disappoints
Administration of a novel fatty acid–bile acid conjugate safely and significantly reduced liver fat content in patients with nonalcoholic fatty liver disease in a phase II trial.
Administration of resveratrol for 8 weeks, however, failed to improve any features of the disease in a separate placebo-controlled study.
In 60 patients aged 18-75 years in a randomized, double-blind, placebo-controlled study of the fatty acid–bile acid conjugate aramchol (Trima Israel Pharmaceutical Products Ltd., Maabarot), 3 months of treatment with the 300-mg dose was associated with a mean decrease in liver fat content of 12.57%, as measured by magnetic resonance spectroscopy, compared with a mean increase of 6.39% in patients who received placebo, Dr. Rifaat Safadi of Hadassah University Medical Center, Jerusalem, and colleagues reported in the December issue of Clinical Gastroenterology and Hepatology.
The difference between the treatment and placebo groups was significant, even after age, sex, and body mass index were adjusted for.
Liver fat content also decreased in patients who received the 100-mg dose of aramchol, but the difference from placebo was not statistically significant, suggesting a dose-response relationship in treated patients, the investigators said (Clin Gastroenterol Hepatol 2014 [doi:10.1016/j.cgh.2014.04.038]).
A trend was seen for improvements over time in endothelial function and levels of alanine aminotransferase and adiponectin.
Patients in the study were from 10 centers in Israel, and all had biopsy-confirmed NAFLD (54 patients) or nonalcoholic steatohepatitis (NASH, 6 patients). They were randomized to receive 100 or 300 mg of aramchol or placebo once daily for 3 months.
No serious or drug-related adverse events occurred in the treated patients, the investigators said.
The findings suggest that aramchol is a candidate for the treatment of fatty liver–related diseases, which are “an increasingly relevant public health issue because of their association with the worldwide epidemics of diabetes and obesity,” and for which treatments are lacking, they said, concluding that longer trials in patients with NASH and metabolic complications are warranted to evaluate metabolic and histologic benefits of treatment.
Findings from the resveratrol study were less encouraging.
Eight weeks of treatment at 3,000 mg daily in overweight or obese men with NAFLD who were recruited from clinics in Brisbane from 2011 through 2012 was no better than placebo for improving insulin resistance, hepatic steatosis, or abdominal fat distribution, Dr. Veronique S. Chachay of the University of Queensland, Brisbane, Australia, and her colleagues also reported in the December issue of Clinical Gastroenterology and Hepatology.
No changes were observed in plasma lipids or antioxidant activity, and levels of alanine and aspartate aminotransferases increased significantly (from 45 to 63 U/L and from 35 to 45 U/L, respectively) through week 6 in the 10 treated patients, compared with the 10 who received placebo (40 to 48 U/L and 36 to 38 U/L, respectively). Resveratrol did not significantly alter transcription of NQO1, PTP1B, IL6, or HO1, the investigators said (Clin Gastroenterol Hepatol 2014 [doi:10.1016/j.cgh.2014.02.024]).
NAFLD is associated with abdominal obesity, insulin resistance, and inflammation, and while weight loss via calorie restriction is known to reduce features of the disease, there is no pharmacologic therapy available to achieve this end.
The investigators sought to determine if resveratrol – a polyphenol that has been shown to prevent high-energy diet-induced steatosis and insulin resistance in animals – might be beneficial in patients with NAFLD.
“The present study demonstrates that the preventive role of resveratrol observed in diet-induced preclinical models of NAFLD does not translate into a therapeutic role in clinically established NAFLD,” they said, adding that clinical dose-finding studies are “paramount to elucidate the dose-response relationship” and that “the purported calorie-restriction mimicking of resveratrol may require investigation in combination with dietary prescription, standard care, and lifestyle modifications to target adequately the complexity of dysregulation in obesity-related chronic disease.”
The aramchol study was supported by Galmed Medical Research, Ltd. Dr. Safadi reported having no disclosures. The resveratrol study was supported by the Princess Alexandra Research Foundation, the Lions Medical Research Foundation, and the National Health and Medical Research Council of Australia. The authors reported having no disclosures.
Administration of a novel fatty acid–bile acid conjugate safely and significantly reduced liver fat content in patients with nonalcoholic fatty liver disease in a phase II trial.
Administration of resveratrol for 8 weeks, however, failed to improve any features of the disease in a separate placebo-controlled study.
In 60 patients aged 18-75 years in a randomized, double-blind, placebo-controlled study of the fatty acid–bile acid conjugate aramchol (Trima Israel Pharmaceutical Products Ltd., Maabarot), 3 months of treatment was associated with a decrease in liver fat content by a mean of 12.57% as measured by magnetic resonance spectroscopy, compared with an increase in liver fat by a mean of 6.39% in patients who received placebo, Dr. Rifaat Safadi of Hadassah University Medical Center, Jerusalem, and colleagues reported in the December issue of Clinical Gastroenterology and Hepatology.
The difference between the treatment and placebo groups was significant, even after age, sex, and body mass index were adjusted for.
Liver fat content also decreased in patients who received treatment with a 100-mg dose of aramchol, but the difference compared with placebo was not statistically significant, indicating a dose-response relationship in treated patients, the investigators said (Clin Gastroenterol Hepatol 2014[doi:10.1016/j.cgh.2014.04.038]).
A trend was seen for improvements over time in endothelial function and levels of alanine aminotransferase and adiponectin.
Patients in the study were from 10 centers in Israel, and all had biopsy-confirmed NAFLD (54 patients) or nonalcoholic steatohepatitis (NASH, 6 patients). They were randomized to receive 100 or 300 mg of aramchol or placebo once daily for 3 months.
No serious or drug-related adverse events occurred in the treated patients, the investigators said.
The findings suggest that aramchol is a candidate for the treatment of fatty liver–related diseases, which are “an increasingly relevant public health issue because of their association with the worldwide epidemics of diabetes and obesity,” and for which treatments are lacking, they said, concluding that longer trials in patients with NASH and metabolic complications are warranted to evaluate metabolic and histologic benefits of treatment.
Findings from the resveratrol study were less encouraging.
Eight weeks of treatment at 3,000 mg daily in overweight or obese men with NAFLD who were recruited from clinics in Brisbane from 2011 through 2012 was no better than placebo for improving insulin resistance, hepatic steatosis, or abdominal fat distribution, Dr. Veronique S. Chachay of the University of Queensland, Brisbane, Australia, and her colleagues also reported in the December issue of Clinical Gastroenterology and Hepatology.
No changes were observed in plasma lipids or antioxidant activity, and levels of alanine and aspartate aminotransferases increased significantly (from 45 to 63 U/L and from 35 to 45 U/L, respectively) until week 6 in the 10 treated patients, compared with the 10 who received placebo (40 to 48 U/L and 36 to 38 U/L, respectively). Resveratrol did not significantly alter transcription of NQO1, PTP1B, IL6, or HO1, the investigators said (Clin Gastroenterol Hepatol 2014 [doi:10.1016/j.cgh.2014.02.024]).
NAFLD is associated with abdominal obesity, insulin resistance, and inflammation, and while weight loss via calorie restriction is known to reduce features of the disease, there is no pharmacologic therapy available to achieve this end.
The investigators sought to determine if resveratrol – a polyphenol that has been shown to prevent high-energy diet-induced steatosis and insulin resistance in animals – might be beneficial in patients with NAFLD.
“The present study demonstrates that the preventive role of resveratrol observed in diet-induced preclinical models of NAFLD does not translate into a therapeutic role in clinically established NAFLD,” they said, adding that clinical dose-finding studies are “paramount to elucidate the dose-response relationship” and that “the purported calorie-restriction mimicking of resveratrol may require investigation in combination with dietary prescription, standard care, and lifestyle modifications to target adequately the complexity of dysregulation in obesity-related chronic disease.”
The aramchol study was supported by Galmed Medical Research, Ltd. Dr. Safadi reported having no disclosures. The resveratrol study was supported by the Princess Alexandra Research Foundation, the Lions Medical Research Foundation, and the National Health and Medical Research Council of Australia. The authors reported having no disclosures.
Key clinical point: Aramchol is a candidate treatment for fatty liver–related diseases; resveratrol requires dose-finding and other studies.
Major finding: The change in liver fat content was a mean of -12.57% with aramchol vs. +6.39% with placebo. Resveratrol had no therapeutic benefit in NAFLD.
Data source: A phase II study of 60 patients, and a randomized, controlled study of 20 patients.
Disclosures: The aramchol study was supported by Galmed Medical Research, Ltd. Dr. Safadi reported having no disclosures. The resveratrol study was supported by the Princess Alexandra Research Foundation, the Lions Medical Research Foundation, and the National Health and Medical Research Council of Australia. The authors reported having no disclosures.
Infliximab serum concentrations, efficacy linked in ulcerative colitis
Higher serum concentrations of infliximab are associated with clinical response, mucosal healing, and clinical remission in adults with moderate to severe ulcerative colitis, according to post hoc analyses of data from the Active Ulcerative Colitis Trials, ACT-1 and ACT-2.
In the 728 patients from the two randomized, controlled phase III pivotal trials, median serum concentrations of infliximab at week 8 were higher among patients with clinical response or mucosal healing during induction than in those who did not achieve these endpoints. For example, the median concentration among those who received 5-mg/kg doses of infliximab was 35.0 mcg/mL in responders, compared with 25.8 mcg/mL in nonresponders, Omoniyi J. Adedokun of Janssen Research and Development, Spring House, Pa., and colleagues reported in the December issue of Gastroenterology (doi:10.1053/j.gastro.2014.08.035).
“Similar results were observed for clinical response and mucosal healing during maintenance at week 30 and week 54,” the investigators wrote, noting that in the 5-mg/kg group, the median trough serum infliximab concentration was several-fold higher in responders than in nonresponders (3.9 vs. 1.2 mcg/mL at week 30 and 5.0 vs. 0.7 mcg/mL at week 54).
Serum concentrations at week 8 did not differ significantly between remitters and nonremitters in the 5-mg/kg group, but they did differ in patients who received 10 mg/kg, and in both dose groups at weeks 30 and 54, the investigators reported.
When assessed by infliximab concentration quartiles, treatment efficacy – defined by clinical response, mucosal healing, and/or clinical remission – generally improved with increasing concentrations in patients in both the 5- and 10-mg/kg groups; those with concentrations in the lowest quartile consistently were less likely to show clinical response, clinical remission, or mucosal healing, and had rates of success approaching those observed in the placebo groups.
Optimal infliximab concentration target thresholds associated with clinical improvement in ulcerative colitis patients in these analyses were 41 mcg/mL at week 8 (sensitivity, specificity, and positive predictive values of 63%, 62%, and 80%, respectively) and 3.7 mcg/mL at week 30 for maintenance of clinical response (sensitivity, specificity, and positive predictive values of 65%, 71%, and 82%, respectively). The data at week 54 suggested a range for serum infliximab concentrations of similar sensitivity, specificity, and positive predictive value, but those data represented only a subset of patients assessed, the investigators said.
Patients who achieved an efficacy outcome, but who failed to maintain that outcome, had lower serum infliximab concentrations earlier in the course of therapy than did those who maintained the outcome, the investigators said.
“In general, the lower the infliximab concentration at a given time point, the more likely the patients were to fail to maintain remission,” they wrote.
The findings demonstrate a strong association between serum infliximab concentration and efficacy outcomes in patients with ulcerative colitis, and highlight the possibility of infliximab dose optimization – particularly in patients who are likely to lose efficacy while receiving a standard dose of infliximab, the investigators said.
Target threshold concentrations identified by this analysis could help clinicians understand why an individual patient fails to achieve the expected efficacy, but a prospective study designed to confirm the importance of optimizing infliximab concentrations is needed before it can be determined whether these results can be exploited to achieve better outcomes for patients with ulcerative colitis, Mr. Adedokun and coinvestigators said.
The ACT trials were funded by Janssen Research and Development, which also employs Mr. Adedokun.
Key clinical point: Serum concentrations of infliximab are associated with response, and could allow for dose optimization.
Major finding: Median concentrations of infliximab at 8 weeks in those receiving 5 mg/kg were 35.0 mcg/mL in responders vs. 25.8 mcg/mL in nonresponders.
Data source: Post hoc analyses of the ACT trials involving 728 patients.
Disclosures: The ACT trials were funded by Janssen Research and Development, which also employs Mr. Adedokun.
Large study shows no link between celiac disease and fertility problems
With the exception of those diagnosed between the ages of 25 and 29 years, women with celiac disease are no more likely than are women without celiac disease to have fertility problems, according to findings from a large population-based study in the United Kingdom.
Of more than 2.4 million women with prospective primary care records available during their childbearing years (ages 15-49 years) between 1990 and 2013, 6,506 were diagnosed with celiac disease. The women with celiac disease had a similar rate of recorded fertility problems as did those without celiac disease (4.4% vs. 4.1%), Nafeesa N. Dhalwani of the University of Nottingham and City Hospital Nottingham, U.K., and colleagues reported in the December issue of Gastroenterology (doi:10.1053/j.gastro.2014.08.025).
Source: American Gastroenterological Association
Further, rates of infertility in women with celiac disease were similar to those in women without celiac disease both before and after diagnosis, except in those aged 25-29 years at the time of diagnosis; rates in those women were 41% higher, compared with women of the same age without celiac disease (incidence rate ratio, 1.41), the investigators said.
“However, the absolute excess risk [for those diagnosed at age 25-29 years] was only 0.5% (5.2/1,000 person-years),” they said.
Women included in the analysis were identified from the Health Improvement Network database. Rates of new clinically recorded fertility problems among those with and without diagnosed celiac disease were stratified by timing of celiac disease diagnosis after adjustment for sociodemographics, comorbidities, and calendar year.
The findings contrast with those from a number of smaller studies that demonstrated an association between infertility and celiac disease, but those studies included small numbers of women, including many who were receiving infertility specialist services, the investigators said, explaining that the women may not have been representative of the general population, and that other small studies found no link between celiac disease and fertility problems.
Celiac disease affects about 1% of the population in North America and Western Europe, and between 60% and 70% of those who are diagnosed are women. Several mechanisms through which celiac disease might affect a woman’s fertility have been described in the literature, but no conclusive evidence exists to support them, they noted.
Despite this lack of evidence and the inconsistent findings from small studies, a number of reviews include infertility as a key nongastrointestinal manifestation of celiac disease. The current findings suggest that most women with celiac disease – whether diagnosed or undiagnosed – do not have a substantially greater likelihood of having clinically recorded fertility problems than do those without celiac disease.
“Therefore, screening when women initially present with fertility problems may not identify a significant number of women with celiac disease, beyond the general population prevalence. This may not always apply to subgroups of women with severe celiac disease. However, in terms of the clinical burden of fertility problems at a population level, these findings should be reassuring for women with celiac disease and all stakeholders involved in their care,” the investigators concluded.
This study was supported by CORE/Coeliac UK, and by a University of Nottingham/Nottingham University Hospitals National Health Service Trust Senior Clinical Research Fellowship. The authors reported having no disclosures.
Key clinical point: Women with celiac disease are not at increased risk of fertility problems.
Major finding: Women with celiac disease had a similar rate of recorded fertility problems as did those without celiac disease (4.4% vs. 4.1%).
Data source: A population-based study of more than 2.4 million women.
Disclosures: This study was supported by CORE/Coeliac UK, and by a University of Nottingham/Nottingham University Hospitals National Health Service Trust Senior Clinical Research Fellowship. The authors reported having no disclosures.
Fecal immunochemical testing, colonoscopy found similar for detecting advanced cancers
Fecal immunochemical testing with a low hemoglobin threshold for colonoscopy resembled one-time, primary colonoscopy for detecting advanced neoplasias in the first-degree relatives of colorectal cancer patients, investigators reported in the November issue of Gastroenterology.
Annual fecal immunochemical testing (FIT), followed by colonoscopy if fecal hemoglobin levels met or exceeded 10 mcg per gram of feces, detected all cases of colorectal cancer (CRC) and 61% of advanced adenomas in the study population, said Dr. Enrique Quintero and Dr. Maria Carrillo of the Universidad de La Laguna in Spain and their associates.
But one-time colonoscopy was better than FIT for detecting all neoplasms as a whole in first-degree relatives of patients with CRC, the researchers reported. Based on the findings, initial screening with FIT should be considered when access to colonoscopy is limited, especially if patients are more likely to accept FIT than colonoscopy, the investigators said (Gastroenterology [doi: 10.1053/j.gastro.2014.08.004]).
The trial included 1,918 first-degree relatives of patients with CRC. In all, 782 relatives were randomized to one-time colonoscopy, while 784 were assigned to annual FIT for 3 years, the researchers reported. Advanced neoplasia was detected in 3.9% of the FIT group and in 5.8% of the primary colonoscopy group, the investigators said (odds ratio, 1.56; 95% confidence interval, 0.95-2.56; P = .08). Rates of detection of advanced neoplasia also were similar between the FIT and primary colonoscopy groups when participants were stratified by age, sex, age of family member with CRC, type of familial relationship, and number of relatives with CRC, the researchers reported. However, primary colonoscopy identified significantly more nonadvanced adenomas (19.8%) than did FIT (5.4%), they added (OR, 4.71; 95% CI, 3.22-6.89; P less than .001).
Participants with negative FIT results were invited to undergo colonoscopy at the end of the study, the researchers said. Follow-up colonoscopies in these relatives showed that FIT had missed 39% of advanced adenomas but no cases of CRC, they reported. To detect one case of advanced CRC, only 4 relatives in the FIT group needed to undergo colonoscopy, compared with 18 members in the primary colonoscopy group, they added. “A potential benefit of FIT over primary colonoscopy in familial CRC screening is that it may save a substantial number of unnecessary colonoscopies, thus preventing harm and lowering costs,” the investigators concluded.
Ethical concerns prevented the researchers from assessing the efficacy of FIT for more than 3 years, they said. In addition, participants knew they could opt out of their assigned screening method before providing informed consent, which could have biased rates of detection of advanced CRC, the researchers noted. However, these rates did not significantly differ between diagnostic groups, they said. The study did not look at sessile serrated or traditional serrated polyps, because the study was designed when these polyps were still considered hyperplastic and nonmalignant, the investigators noted.
Future studies should evaluate the acceptance of FIT-based screening and its effects on mortality in familial CRC, the researchers concluded.
Their study was supported by grants from Fundación Canaria para la Investigación Sanitaria, Caja de Canarias, and Departamento de Medicina Interna de la Universidad de La Laguna. They reported having no conflicts of interest.
Colonoscopy is the preferred screening method for first-degree relatives (FDRs) of colorectal cancer patients. But evidence supporting the use of colonoscopy in this high-risk population remains indirect, with no randomized trials showing a reduction in CRC incidence or mortality. Recently, screening with fecal immunochemical testing in average-risk populations has gained widespread adoption, mainly because of its low cost, ease of use, and moderate sensitivity and high specificity for CRC. However, despite the fact that FIT is an accepted screening strategy in the average-risk population, little is known regarding FIT’s ability to detect advanced neoplasia in FDRs of CRC patients.
Dr. Quintero and his colleagues are to be congratulated for performing a randomized trial comparing the efficacy of repeated annual FIT versus a one-time colonoscopy in detecting advanced neoplasia in FDRs of CRC patients. The results of their study clearly show that annual FIT is equally effective in detecting advanced neoplasia, compared with a one-time colonoscopy after 3 years.
However, despite the study’s statistical significance in demonstrating equivalence between the two screening modalities, there was still a marked absolute difference in detecting advanced neoplasia between the two tests. Furthermore, the usefulness of FIT screening as an alternative to colonoscopy in this high-risk population will depend on patient uptake. The current study was unable to address this issue because participants knew they could opt out of their assigned strategy and still participate in the study, which was seen in the high crossover rate from the FIT group to the colonoscopy group.
These issues aside, the Quintero study provides important information about alternative screening modalities for the detection of advanced neoplasia in FDRs of CRC patients, and paves the way for future clinical studies.
Dr. Jeffrey Lee is assistant clinical professor of medicine, division of gastroenterology, University of California, San Francisco. He has no conflicts of interest.
Key clinical point: Screening with fecal immunochemical testing was comparable to one-time colonoscopy for detecting advanced neoplasias in relatives of colorectal cancer patients.
Major finding: In all, 3.9% of the FIT group and 5.8% of the primary colonoscopy group had advanced neoplasia (odds ratio, 1.56; 95% confidence interval, 0.95-2.56; P = .08).
Data source: Randomized controlled study of 1,918 first-degree relatives of patients with colorectal cancer.
Disclosures: The research was supported by grants from Fundación Canaria para la Investigación Sanitaria, Caja de Canarias, and Departamento de Medicina Interna de la Universidad de La Laguna. The investigators reported having no conflicts of interest.