AGA Guideline: Therapeutic drug monitoring in IBD
Physicians should perform reactive therapeutic drug monitoring to guide changes in anti–tumor necrosis factor (TNF) therapy in patients with active inflammatory bowel disease and should consider target trough concentrations of at least 5 mcg/mL for infliximab, at least 7.5 mcg/mL for adalimumab, and at least 20 mcg/mL for certolizumab pegol, according to a guideline from the AGA Institute, published in the September 2017 issue of Gastroenterology (Gastroenterology. doi: 10.1053/j.gastro.2017.07.032).
Therapeutic drug monitoring can help guide whether to ramp up a dose (if the trough level is below the threshold) or switch therapy (if the trough level is above the threshold) when patients are not responding adequately to maintenance treatment. A nonresponder with optimal trough concentrations might need to switch drug classes, the guideline noted. A patient with low trough levels and no antidrug antibodies is probably experiencing rapid drug clearance in the setting of high inflammation. A patient with low or undetectable trough levels and high antidrug antibody titers has developed neutralizing antidrug antibodies. However, trough concentrations can vary for many other reasons, ranging from disease severity and inflammation to body mass index and sex. Therefore, target levels also vary and can be challenging to set.
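To make the reactive TDM logic above concrete, here is a minimal, illustrative Python sketch. The trough thresholds are the guideline's suggested targets; the function itself, its structure, and the suggested-action wording are our own assumptions for illustration, not the guideline's algorithm.

```python
# Illustrative sketch of the reactive TDM interpretation described above.
# Thresholds are the guideline's suggested trough targets; the function,
# its structure, and the suggested actions are hypothetical, for
# illustration only.

TROUGH_TARGETS_MCG_PER_ML = {
    "infliximab": 5.0,
    "adalimumab": 7.5,
    "certolizumab pegol": 20.0,
}

def interpret_reactive_tdm(drug: str, trough_mcg_per_ml: float,
                           antidrug_antibodies: bool) -> str:
    """Sketch a next step for active IBD not responding to anti-TNF maintenance."""
    target = TROUGH_TARGETS_MCG_PER_ML[drug]
    if trough_mcg_per_ml >= target:
        # Adequate drug exposure but ongoing disease activity.
        return "Trough at or above target despite active disease: consider switching drug class."
    if antidrug_antibodies:
        # Low or undetectable trough with high antidrug antibody titers.
        return "Neutralizing antidrug antibodies have likely developed: consider a change in therapy."
    # Low trough without antibodies: rapid clearance amid high inflammation.
    return "Subtherapeutic trough without antibodies: consider dose escalation."

# Example: infliximab trough of 2 mcg/mL with no antibodies -> dose escalation.
print(interpret_reactive_tdm("infliximab", 2.0, antidrug_antibodies=False))
```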
The AGA makes no recommendation about routine, proactive TDM in patients with quiescent IBD who are on anti-TNF agents. While proactive TDM can shed light on endoscopic response and drug clearance, it might also trigger a premature switch of therapies; this is particularly likely because physicians have sparse data on either target trough levels for asymptomatic patients or the clinical significance of “low-titer” antidrug antibodies. The optimal frequency of proactive TDM also remains unclear.
Pending better data, the AGA recommended checking infliximab or adalimumab trough levels as close to the next dose as possible – that is, within 24 hours. Drug trough levels are consistent across commercial assays, but antidrug antibody titers are not, and there are no uniform thresholds for clinically relevant antidrug antibody titers. “Therefore, it may be beneficial to utilize the same assay when checking for trough concentration and antidrug antibodies,” the guideline stated.
For patients on a thiopurine, routine testing of thiopurine methyltransferase (TPMT) enzyme or genotype is recommended to guide dosing. In three pooled studies comprising 1,145 patients, only two patients were homozygous; further, rates of hematologic adverse events, clinical remission, and treatment discontinuation did not differ based on TPMT testing itself. However, using TPMT testing to guide dosing was associated with an 89% decrease in the risk of hematologic adverse events among patients who had a homozygous genotype or had low or absent TPMT enzymatic activity. “While this risk may be mitigated by routine laboratory CBC checking, adherence to regular monitoring in clinical practice is suboptimal,” the guideline stated. “It is important to continue to perform routine lab monitoring [of] CBC and liver enzymes after starting a thiopurine, regardless of the TPMT testing results.”
The AGA also conditionally supported reactive monitoring of thiopurine metabolites to guide treatment changes if patients develop breakthrough symptoms or treatment-related adverse effects. For active IBD symptoms in spite of thiopurine monotherapy, a target 6-thioguanine (6-TGN) cutoff between 230 and 450 pmol per 8 x 10^8 RBC is recommended. Again, supporting evidence is of “very low quality” – in a retrospective, observational study, patients who received treatment according to a TDM algorithm were five times more likely to respond to a change in therapy (relative risk, 5.2). The guideline recommended against monitoring thiopurine metabolites in quiescent IBD. Studies did not support this practice, compared with standard dosing, although no study of thiopurine metabolites included patients on thiopurine/anti-TNF combination therapy, the guideline’s authors noted.
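A similar sketch, under the same caveats, illustrates the 6-TGN target window above; the category labels are an illustrative reading of the 230-450 pmol per 8 x 10^8 RBC range, not the guideline's wording.

```python
# Hypothetical helper illustrating the 6-TGN target window discussed above
# for a patient with active IBD symptoms on thiopurine monotherapy.
# The labels are illustrative interpretations, not the guideline's wording.

TGN_TARGET_LOW = 230   # pmol per 8 x 10^8 RBC
TGN_TARGET_HIGH = 450  # pmol per 8 x 10^8 RBC

def classify_6tgn(level: float) -> str:
    """Classify a 6-TGN level relative to the target window."""
    if level < TGN_TARGET_LOW:
        return "Below target window: thiopurine exposure may be suboptimal."
    if level <= TGN_TARGET_HIGH:
        return "Within target window: exposure adequate despite active symptoms."
    return "Above target window: exposure already high."

print(classify_6tgn(150))  # Below target window
```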
The guideline includes clinical-decision support tools on when to perform TDM and how to interpret results when patients are taking an anti-TNF agent or a thiopurine. The guideline does not cover vedolizumab or ustekinumab because data are sparse. Other knowledge gaps include when best to measure trough concentrations; whether empiric dose escalation or TDM is preferred if response to induction is suboptimal; how target trough concentrations vary based on disease phenotype, disease state, or treatment goals; which levels and durations of antidrug antibody titers are clinically significant; and whether to suppress antidrug antibodies before changing therapy. Future studies should compare routine proactive and reactive TDM, investigate how often to perform proactive TDM, and characterize TDM of newly approved biologic agents, the guideline concluded.
The authors of the guideline document disclosed no conflicts related to the guideline topic.
FROM GASTROENTEROLOGY
AGA Clinical Practice Update: Opioids in gastroenterology
Physicians should consistently rule out opioid therapy as the cause of gastrointestinal symptoms, states a new clinical practice update published in the September 2017 issue of Clinical Gastroenterology and Hepatology (Clin Gastroenterol Hepatol. doi: 10.1016/j.cgh.2017.05.014).
About 4% of Americans receive long-term opioid therapy, primarily for musculoskeletal, postsurgical, or vascular pain, as well as nonsurgical abdominal pain, wrote Michael Camilleri, MD, AGAF, of the Mayo Clinic in Rochester, Minn., and his associates. Because opioid receptors thickly populate the gastrointestinal tract, exogenous opioids can trigger a variety of gastrointestinal symptoms. Examples include achalasia, gastroparesis, nausea, postsurgical ileus, constipation, and narcotic bowel syndrome.
In the stomach, opioid use can cause gastroparesis, early satiety, and postprandial nausea and emesis, especially in the postoperative setting. Even novel opioid agents that are less likely to cause constipation can retard gastric emptying. For example, tapentadol, a mu-opioid agonist and norepinephrine reuptake inhibitor, delays emptying to the same extent as oxycodone. Tramadol also appears to slow overall orocecal transit. Although gastroparesis itself can cause nausea and emesis, opioids also directly stimulate the chemoreceptor trigger zone in the area postrema in the floor of the fourth ventricle. Options for preventive therapy include using a prokinetic, such as metoclopramide, prochlorperazine, or a 5-hydroxytryptamine-3 antagonist, especially if patients are receiving opioids for postoperative pain control.
Exogenous opioids also can cause ileus, especially after abdominal surgery. These patients are already at risk of ileus because of surgical stress from bowel handling, secretion of inflammatory mediators and endogenous opioids, and fluctuating hormone and electrolyte levels. Postoperative analgesia with mu-opioids adds to the risk of ileus by increasing fluid absorption and inhibiting colonic motility.
Both postsurgical and nonsurgical opioid use also can trigger opioid-induced constipation (OIC), in which patients have fewer than three spontaneous bowel movements a week, harder stools, increased straining, and a feeling of incomplete evacuation. Patients may also report nausea, emesis, and gastroesophageal reflux. Even low-dose and short-term opioid therapy can lead to OIC. Symptoms and treatment response can be assessed with the bowel function index, in which patients rate ease of defecation, completeness of bowel evacuation, and severity of constipation over the past week on a scale of 0-100. Scores of 0-29 suggest no OIC. Patients who score above 30 despite over-the-counter laxatives are candidates for stepped-up treatments, including prolonged-release oxycodone/naloxone, the intestinal secretagogue lubiprostone, or peripherally acting mu-opioid receptor antagonists (PAMORAs), such as methylnaltrexone (12 mg subcutaneously) and naloxegol (12.5 mg or 25 mg per day orally). Additionally, tapentadol controls pain at lower doses than oxycodone and is less likely to cause constipation.
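As a rough illustration of the scoring just described, the sketch below computes a bowel function index as the mean of the three 0-100 patient ratings; treating the index as a simple mean of the three items is an assumption of this sketch.

```python
# Illustrative bowel function index (BFI) calculation based on the description
# above: three 0-100 patient ratings over the past week. Averaging the three
# items into a single 0-100 score is an assumption for this sketch.

def bowel_function_index(ease_of_defecation: float,
                         completeness_of_evacuation: float,
                         constipation_severity: float) -> float:
    return (ease_of_defecation + completeness_of_evacuation + constipation_severity) / 3.0

bfi = bowel_function_index(40, 55, 35)
# Scores of 0-29 suggest no OIC; higher scores despite over-the-counter
# laxatives suggest the patient may be a candidate for stepped-up treatment.
print(f"BFI = {bfi:.1f}")
```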
Narcotic bowel syndrome typically presents as moderate to severe daily abdominal pain lasting more than 3 months in patients on long-term opioids at a dosage equivalent to more than 100 mg of morphine daily. Patients report generalized, persistent, colicky abdominal pain that does not respond to dose escalation and worsens with dose tapering. Work-up is negative for differentials such as kidney stones or bowel obstruction. One epidemiological study estimated that 4% of patients on long-term opiates develop narcotic bowel syndrome, but the true prevalence may be higher, according to the experts who authored this update. Mechanisms remain unclear but may include neuroplastic changes that favor the facilitation of pain signals rather than their inhibition, inflammation of spinal glial cells through activation of toll-like receptors, abnormal function of the N-methyl-D-aspartate receptor at the level of the spinal cord, and central nociceptive abnormalities related to certain psychological traits or a history of trauma.
Treating narcotic bowel syndrome requires detoxification with appropriate nonopioid therapies for pain, anxiety, and withdrawal symptoms, including the use of clonidine. “This is best handled through specialists or centers with expertise in opiate dependence,” the experts stated. Patients who are able to stay off narcotics report improvements in pain, but the recidivism rate is about 50%.
The practice update also covers opioid therapy for gastrointestinal disorders. The PAMORA alvimopan shortens time to first postoperative stool without counteracting opioid analgesia during recovery. Alvimopan also has been found to hasten recovery of gastrointestinal function in patients with postoperative ileus after bowel resection. There is no evidence for using mu-opioid agonists for pain associated with irritable bowel syndrome (IBS), but the synthetic peripheral mu-opioid receptor agonist loperamide can improve stool consistency and urgency. A typical dose is 2 mg after each loose bowel movement or 2-4 mg before eating in cases of postprandial diarrhea. The mixed mu- and kappa-opioid receptor agonist and delta-opioid receptor antagonist eluxadoline also can potentially improve stool consistency and urgency, global IBS symptoms, IBS symptom severity score, and quality of life. However, the FDA warns against using eluxadoline in patients who do not have a gallbladder because of the risk of severe outcomes – including death – related to sphincter of Oddi spasm and pancreatitis. Eluxadoline has been linked to at least two such fatalities in cholecystectomized patients. In each case, symptoms began after a single dose.
Dr. Camilleri is funded by the National Institutes of Health. He disclosed ties to AstraZeneca and Shionogi. The two coauthors disclosed ties to Forest Research Labs, Ironwood Pharmaceuticals, Prometheus, and Salix.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Colonic microbiota encroachment linked to diabetes
Bacterial infiltration into the colonic mucosa was associated with type 2 diabetes mellitus in humans, confirming prior findings in mice, investigators said.
Unlike in mice, however, microbiota encroachment did not correlate with human adiposity per se, reported Benoit Chassaing, PhD, of Georgia State University, Atlanta, and his associates. Their mouse models all have involved low-grade inflammation, which might impair insulin/leptin signaling and thereby promote both adiposity and dysglycemia, they said. In contrast, “we presume that humans can become obese for other reasons not involving the microbiota,” they added. The findings were published in the September issue of Cellular and Molecular Gastroenterology and Hepatology (2017;2[4]:205-21. doi: 10.1016/j.jcmgh.2017.04.001).
For the study, the investigators analyzed colonic mucosal biopsies from 42 middle-aged adults who underwent screening colonoscopies at a single Veterans Affairs hospital. All but one of the patients were men, 86% were overweight, 45% were obese, and 33% (14 patients) had diabetes. The researchers measured the shortest distance between bacteria and the epithelium using confocal microscopy and fluorescence in situ hybridization.
Nonobese, nondiabetic patients had bacteria residing “almost exclusively” in outer regions of the mucus layer, while obese diabetic patients had bacteria in the dense inner mucus near the epithelium, said the investigators. Unlike in mice, bacterial-epithelial distances did not correlate with adiposity per se among individuals without diabetes (P = .4). Conversely, patients with diabetes had bacterial-epithelial distances that were about one-third of those in euglycemic individuals (P less than .0001), even when they were not obese (P less than .001).
“We conclude that microbiota encroachment is a feature of insulin resistance–associated dysglycemia in humans,” Dr. Chassaing and his associates wrote. Microbiota encroachment did not correlate with ethnicity, use of antibiotics or diabetes treatments, or low-density lipoprotein levels, but it did correlate with a rise in CD19+ cells, probably mucosal B cells, they said. Defining connections among microbiota encroachment, B-cell responses, and metabolic disease might clarify the pathophysiology and treatment of metabolic syndrome, they concluded.
The investigators also induced hyperglycemia in wild-type mice by giving them water with 10% sucrose and intraperitoneal streptozotocin injections. Ten days after the last injection, they measured fasting blood glucose, fecal glucose, and colonic bacterial-epithelial distances. Even though fecal glucose rose as expected, they found no evidence of microbiota encroachment. They concluded that short-term (2-week) hyperglycemia was not enough to cause encroachment. Thus, microbiota encroachment is a characteristic of type 2 diabetes, not of adiposity per se, correlates with disease severity, and might stem from chronic inflammatory processes that drive insulin resistance, they concluded.
Funders included the National Institutes of Health, VA-MERIT, and the Crohn’s and Colitis Foundation of America. The investigators had no relevant conflicts of interest.
Dr. Chassaing and his colleagues examined the possible importance of the bacteria-free layer adjacent to the colonic epithelium in metabolic syndrome. A shrinking of this layer, termed “bacterial encroachment,” has been associated with human inflammatory bowel disease as well as mouse models of both colitis and metabolic syndrome, but the current study represents its first clear demonstration in human diabetes. In a cohort of 42 patients, the authors found that the epithelial-bacterial distance was inversely correlated with body mass index, fasting glucose, and hemoglobin A1c levels.
Interestingly, the primary predictor of encroachment in these patients was dysglycemia, not body mass index. This could not have been tested in standard mouse models where, because of the nature of the experimental insult, obesity and dysglycemia are essentially linked. Comparing obese human patients with and without dysglycemia, on the other hand, showed that encroachment is only clearly correlated with failed glucose regulation. This, however, is not the end of the story: In coordinated experiments with a short-term murine dysglycemia model, high glucose levels were not sufficient to elicit encroachment, suggesting a more complex metabolic circuit as the driver.
Mark R. Frey, PhD, is associate professor of pediatrics and biochemistry and molecular medicine at the Saban Research Institute, Children’s Hospital Los Angeles, University of Southern California.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: Microbiota encroachment into colonic mucosa characterizes type 2 diabetes in humans.
Major finding: Regardless of whether they were obese or normal weight, patients with diabetes had bacterial-epithelial colonic distances that were one-third of those in euglycemic individuals (P less than .001).
Data source: A study of 42 Veterans Affairs patients with and without type 2 diabetes mellitus.
Disclosures: Funders included the National Institutes of Health, VA-MERIT, and the Crohn’s and Colitis Foundation of America. The investigators had no relevant conflicts of interest.
VIDEO: Study highlights risks of postponing cholecystectomy
Almost half of patients who underwent endoscopic retrograde cholangiopancreatography (ERCP) did not undergo cholecystectomy (CCY) within the next 60 days, according to the results of a large, retrospective cohort study reported in the September issue of Gastroenterology (doi: 10.1053/j.gastro.2017.05.048).
“Although early and delayed CCY equally reduce the risk of subsequent recurrent biliary events, patients are at 10-fold higher risk of a recurrent biliary event while waiting for a delayed CCY, compared with patients who underwent early CCY,” wrote Robert J. Huang, MD, and his associates of Stanford (Calif.) University Medical Center. Delayed CCY is cost effective, but that benefit must be weighed against the risk of loss to follow-up, especially if patients have “little or no health insurance,” they said.
Gallstone disease affects up to 15% of adults in developed societies, including about 20-25 million Americans. Yearly treatment costs total more than $6.2 billion and have risen by more than 20% over 3 decades, according to multiple studies. Approximately 20% of patients with gallstone disease have choledocholithiasis, mainly because gallstones can pass from the gallbladder into the common bile duct. After undergoing ERCP, such patients are typically referred for CCY, but there are no “societal guidelines” on timing the referral, the researchers said. Practice patterns remain “largely institution based and may be subject to the vagaries of surgeon availability and other institutional resource constraints.” One prior study linked a median 7-week wait time for CCY with a 20% rate of recurrent biliary events. To evaluate large-scale practice patterns, the researchers studied 4,516 patients who had undergone ERCP for choledocholithiasis in California (during 2009-2011), New York (during 2011-2013), and Florida (during 2012-2014) and calculated timing and rates of subsequent CCY, recurrent biliary events, and deaths. Patients were followed for up to 365 days after ERCP.
Of the 4,516 patients studied, 1,859 (41.2%) patients underwent CCY during their index hospital admission (early CCY). Of the 2,657 (58.8%) patients who were discharged without CCY, only 491 (18%) had a planned CCY within 60 days (delayed CCY), 350 (71.3%) of which were done in an outpatient setting. Of the patients in the study, 2,168 (48.0%) did not have a CCY (no CCY) during their index visit or within 60 days. Over 365 days of follow-up, 10% of patients who did not have a CCY had recurrent biliary events, compared with 1.3% of patients who underwent early or delayed CCY. The risk of recurrent biliary events for patients who underwent early or delayed CCY was about 88% lower than if they had had no CCY within 60 days of ERCP (P less than .001 for each comparison). Performing CCY during index admission cut the risk of recurrent biliary events occurring within 60 days by 92%, compared with delayed or no CCY (P less than .001).
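As a back-of-envelope check on the figures above (illustrative only; the study's adjusted estimates differ from this crude calculation), the reported recurrence rates of 10% without CCY and 1.3% with early or delayed CCY correspond to roughly an 87% relative risk reduction, in line with the "about 88% lower" figure.

```python
# Crude, unadjusted check of how the reported recurrence rates relate to the
# "about 88% lower risk" figure; the study's adjusted estimates will differ.

recurrence_no_ccy = 0.10     # recurrent biliary events without CCY
recurrence_with_ccy = 0.013  # recurrent biliary events with early or delayed CCY

risk_ratio = recurrence_with_ccy / recurrence_no_ccy  # ~0.13
relative_risk_reduction = 1 - risk_ratio              # ~0.87
print(f"Risk ratio ~{risk_ratio:.2f}; relative risk reduction ~{relative_risk_reduction:.0%}")
```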
In all, 15 (0.7%) patients who did not undergo CCY died after subsequent hospitalization for a recurrent biliary event, compared with 1 patient who underwent early CCY (0.1%; P less than .001). There were no deaths associated with recurrent biliary events in the delayed-CCY group. Rates of all-cause mortality over 365 days were 3.1% in the no-CCY group, 0.6% in the early-CCY group, and 0% in the delayed-CCY group. Thus, cumulative death rates were about seven times higher among patients who did not undergo CCY compared with those who did (P less than .001).
Patients who did not undergo CCY tended to be older than delayed- and early-CCY patients (mean ages 66 years, 58 years, and 52 years, respectively). No-CCY patients also tended to have more comorbidities. Nonetheless, having an early CCY retained a “robust” protective effect against recurrent biliary events after accounting for age, sex, comorbidities, stent placement, facility volume, and state of residence. Even after researchers adjusted for those factors, the protective effect of early CCY dropped by less than 5% (from 92% to about 87%), the investigators said.
They also noted that the overall cohort averaged 60 years of age and that 64% were female, which is consistent with the epidemiology of biliary stone disease. Just over half were non-Hispanic whites. Medicare was the single largest primary payer (46%), followed by private insurance (28%) and Medicaid (16%).
“A strategy of delayed CCY performed on an outpatient basis was least costly,” the researchers said. “Performance of early CCY was inversely associated with low facility volume. Hispanic race, Asian race, Medicaid insurance, and no insurance associated inversely with performance of delayed CCY.”
Funders included a seed grant from the Stanford division of gastroenterology and hepatology and the National Institutes of Health. The investigators had no conflicts of interest.
Almost half of patients who underwent endoscopic retrograde cholangiopancreatography (ERCP) did not undergo cholecystectomy (CCY) within the next 60 days according to the results of a large, retrospective cohort study reported in the September issue of Gastroenterology (doi: 10.1053/j.gastro.2017.05.048).
“Although early and delayed CCY equally reduce the risk of subsequent recurrent biliary events, patients are at 10-fold higher risk of a recurrent biliary event while waiting for a delayed CCY, compared with patients who underwent early CCY,” wrote Robert J. Huang, MD, and his associates of Stanford (Calif.) University Medical Center. Delayed CCY is cost effective, but that benefit must be weighed against the risk of loss to follow-up, especially if patients have “little or no health insurance,” they said.
Source: American Gastroenterological Association
Gallstone disease affects up to 15% of adults in developed societies, including about 20-25 million Americans. Yearly costs of treatment tally at more than $6.2 billion and have risen by more than 20% in 3 decades, according to multiple studies. Approximately 20% of patients with gallstone disease have choledocholithiasis, mainly because gallstones can pass from the gallbladder into the common bile duct. After undergoing ERCP, such patients are typically referred for CCY, but there are no “societal guidelines” on timing the referral, the researchers said. Practice patterns remain “largely institution based and may be subject to the vagaries of surgeon availability and other institutional resource constraints.” One prior study linked a median 7-week wait time for CCY with a 20% rate of recurrent biliary events. To evaluate large-scale practice patterns, the researchers studied 4,516 patients who had undergone ERCP for choledocholithiasis in California (during 2009-2011), New York (during 2011-2013), and Florida (during 2012-2014) and calculated timing and rates of subsequent CCY, recurrent biliary events, and deaths. Patients were followed for up to 365 days after ERCP.
Of the 4,516 patients studied, 1,859 (41.2%) patients underwent CCY during their index hospital admission (early CCY). Of the 2,657 (58.8%) patients who were discharged without CCY, only 491 (18%) had a planned CCY within 60 days (delayed CCY), 350 (71.3%) of which were done in an outpatient setting. Of the patients in the study, 2,168 (48.0%) did not have a CCY (no CCY) during their index visit or within 60 days. Over 365 days of follow-up, 10% of patients who did not have a CCY had recurrent biliary events, compared with 1.3% of patients who underwent early or delayed CCY. The risk of recurrent biliary events for patients who underwent early or delayed CCY was about 88% lower than if they had had no CCY within 60 days of ERCP (P less than .001 for each comparison). Performing CCY during index admission cut the risk of recurrent biliary events occurring within 60 days by 92%, compared with delayed or no CCY (P less than .001).
In all, 15 (0.7%) patients who did not undergo CCY died after subsequent hospitalization for a recurrent biliary event, compared with 1 patient who underwent early CCY (0.1%; P less than .001). There were no deaths associated with recurrent biliary events in the delayed-CCY group. Rates of all-cause mortality over 365 days were 3.1% in the no-CCY group, 0.6% in the early-CCY group, and 0% in the delayed-CCY group. Thus, cumulative death rates were about seven times higher among patients who did not undergo CCY compared with those who did (P less than .001).
Patients who did not undergo CCY tended to be older than delayed- and early-CCY patients (mean ages 66 years, 58 years, and 52 years, respectively). No-CCY patients also tended to have more comorbidities. Nonetheless, having an early CCY retained a “robust” protective effect against recurrent biliary events after accounting for age, sex, comorbidities, stent placement, facility volume, and state of residence. Even after researchers adjusted for those factors, the protective effect of early CCY dropped by less than 5% (from 92% to about 87%), the investigators said.
They also noted that the overall cohort averaged 60 years of age and that 64% were female, which is consistent with the epidemiology of biliary stone disease. Just over half were non-Hispanic whites. Medicare was the single largest primary payer (46%), followed by private insurance (28%) and Medicaid (16%).
“A strategy of delayed CCY performed on an outpatient basis was least costly,” the researchers said. “Performance of early CCY was inversely associated with low facility volume. Hispanic race, Asian race, Medicaid insurance, and no insurance associated inversely with performance of delayed CCY.”
Funders included a seed grant from the Stanford division of gastroenterology and hepatology and the National Institutes of Health. The investigators had no conflicts of interest.
FROM GASTROENTEROLOGY
Key clinical point: Almost half of patients who underwent endoscopic retrograde cholangiopancreatography (ERCP) did not undergo cholecystectomy within 60 days.
Major finding: A total of 48% had no cholecystectomy within 60 days. Performing cholecystectomy during index admission cut the risk of recurrent biliary events within 60 days by 92%, compared with delayed or no cholecystectomy (P less than .001).
Data source: A multistate, retrospective study of 4,516 patients hospitalized with choledocholithiasis.
Disclosures: Funders included a Stanford division of gastroenterology and hepatology divisional seed grant and the National Institutes of Health. The investigators had no conflicts of interest.
Minimally invasive screening for Barrett’s esophagus offers cost-effective alternative
The high cost of endoscopy makes screening patients with gastroesophageal reflux disease (GERD) for Barrett’s esophagus an expensive proposition. But using a minimally invasive test followed by endoscopy only if results are positive could cut costs by up to 41%, according to investigators.
The report is in the September issue of Clinical Gastroenterology and Hepatology (doi: 10.1016/j.cgh.2017.02.017).
The findings mirror those from a prior study (Gastroenterology. 2013 Jan;144[1]:62-73.e60) of the new cytosponge device, which tests surface esophageal tissue for trefoil factor 3, a biomarker for Barrett’s esophagus, said Curtis R. Heberle, of Massachusetts General Hospital in Boston, and his associates. In addition, two separate models found the cytosponge strategy cost effective compared with no screening (incremental cost-effectiveness ratios [ICERs], about $26,000-$33,000). However, using the cytosponge instead of screening all GERD patients with endoscopy would reduce quality-adjusted life-years (QALYs) by about 1.8-5.5 years for every 1,000 patients.
Rates of esophageal adenocarcinoma have climbed more than sixfold in the United States in 4 decades, and 5-year survival rates remain below 20%. Nonetheless, the high cost of endoscopy and the 10%-20% prevalence of GERD make screening all patients for Barrett’s esophagus infeasible. To evaluate the cytosponge strategy, the researchers fit data from the multicenter BEST2 study (PLoS Med. 2015 Jan;12[1]:e1001780) into two validated models calibrated to high-quality Surveillance, Epidemiology, and End Results (SEER) data on esophageal cancer. Both models compared no screening with a one-time screen by either endoscopy alone or cytosponge with follow-up endoscopy in the event of a positive test. The models assumed patients were male, were 60 years old, and had GERD but not esophageal adenocarcinoma.
Without screening, there were about 14-16 cancer cases and about 15,077 QALYs for every 1,000 patients. The cytosponge strategy was associated with about 8-13 cancer cases and about 15,105 QALYs. Endoscopic screening produced the most benefit overall – only about 7-12 cancer cases, with more than 15,100 QALYs. “However, greater benefits were accompanied by higher total costs,” the researchers said. For every 1,000 patients, no screening cost about $704,000 to $762,000, the cytosponge strategy cost about $1.5 to $1.6 million, and population-wide endoscopy cost about $2.1 to $2.2 million. Thus, the cytosponge method would lower the cost of screening by 37%-41%, compared with endoscopically screening all men with GERD. The cytosponge was also cost effective in a model of 60-year-old women with GERD.
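The ICERs of about $26,000-$33,000 per QALY cited earlier can be reproduced approximately from these per-1,000-patient figures using the standard ICER formula (incremental cost divided by incremental QALYs); the calculation below is only an illustrative check of the reported numbers.

```python
# Approximate reproduction of the cytosponge-vs.-no-screening ICER from the
# per-1,000-patient costs and QALYs quoted above (illustrative check only).
cost_no_screening = (704_000, 762_000)      # dollars per 1,000 patients
cost_cytosponge = (1_500_000, 1_600_000)    # dollars per 1,000 patients
qalys_no_screening = 15_077
qalys_cytosponge = 15_105

incremental_qalys = qalys_cytosponge - qalys_no_screening   # ~28 QALYs per 1,000 patients

icer_low = (cost_cytosponge[0] - cost_no_screening[1]) / incremental_qalys
icer_high = (cost_cytosponge[1] - cost_no_screening[0]) / incremental_qalys

print(f"ICER ~${icer_low:,.0f} to ~${icer_high:,.0f} per QALY gained")
# ~$26,000 to ~$32,000 per QALY, consistent with the reported $26,000-$33,000 range
```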
Using only endoscopic screening was not cost effective in either model, exceeding a $100,000-per-QALY willingness-to-pay threshold by anywhere from $107,000 to $330,000. The cytosponge is not yet available commercially, but the investigators assumed it cost $182 based on information from the manufacturer (Medtronic) and Medicare payments for similar devices. Although the findings withstood variations in indirect costs and age at initial screening, they were “somewhat sensitive” to variations in costs of the cytosponge and its presumed sensitivity and specificity in clinical settings. However, endoscopic screening became cost effective only when the cytosponge test cost at least $225.
The models assumed perfect adherence to screening, which probably exaggerated the effectiveness of the cytosponge and endoscopic screening, the investigators said. They noted that cytosponge screening can be performed without sedation during a short outpatient visit.
The National Institutes of Health provided funding. The investigators had no relevant disclosures.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: Using a minimally invasive screen for Barrett’s esophagus and following up with endoscopy if results are positive is a cost-effective alternative to endoscopy alone in patients with gastroesophageal reflux disease.
Major finding: The two-step screening strategy cut screening costs by 37%-41% but was associated with 1.8-5.5 fewer quality-adjusted life years for every 1,000 patients with GERD.
Data source: Two validated models based on Surveillance, Epidemiology, and End Results data, and data from the multicenter BEST2 trial.
Disclosures: The National Institutes of Health provided funding. The investigators had no relevant disclosures.
VIDEO: Large distal nongranular colorectal polyps were most likely to contain occult invasive cancers
Large sessile or flat colorectal polyps or laterally spreading lesions were most likely to contain covert malignancies when their location was rectosigmoid, their Paris classification was 0-Is or 0-IIa+Is, and they were nongranular, according to the results of a multicenter prospective cohort study of 2,106 consecutive patients reported in the September issue of Gastroenterology (doi: 10.1053/j.gastro.2017.05.047).
“Distal nongranular lesions have a high risk of occult SMIC [submucosal invasive cancer], whereas proximal, granular 0-IIa lesions, after a careful assessment for features associated with SMIC, have a very low risk,” wrote Nicholas G. Burgess, MD, of Westmead Hospital, Sydney, with his associates. “These findings can be used to inform decisions [about] which patients should undergo endoscopic submucosal dissection, endoscopic mucosal resection, or surgery.”
Many studies of colonic lesions have examined predictors of SMIC. Nonetheless, clinicians need more information on factors that improve clinical decision making, especially as colonic endoscopic submucosal dissection becomes more accessible, the researchers said. Large colonic lesions can contain SMICs that are not visible on endoscopy, and characterizing predictors of this occurrence could help patients and clinicians decide between endoscopic submucosal dissection and endoscopic mucosal resection. To do so, the researchers analyzed histologic specimens from 2,277 colonic lesions larger than 20 mm (average size, 37 mm) that lacked overt endoscopic high-risk features. The study ran from 2008 through 2016, study participants averaged 68 years of age, and 53% were male. A total of 171 lesions (8%) had evidence of SMIC on pathologic review, and 138 lesions had covert SMIC. Predictors of overt and occult SMIC included Kudo pit pattern V, a depressed component (0-IIc), rectosigmoid location, 0-Is or 0-IIa+Is Paris classification, nongranular surface morphology, and larger size. After excluding lesions with obvious SMIC features – including serrated lesions and those with depressed components (Kudo pit pattern of V and Paris 0-IIc) – the strongest predictors of occult SMIC included Paris classification, surface morphology, size, and location.
“Proximal 0-IIa G or 0-Is granular lesions had the lowest risk of SMIC (0.7% and 2.3%), whereas distal 0-Is nongranular lesions had the highest risk (21.4%),” the investigators added. Lesion location, size, and combined Paris classification and surface topography showed the best fit in a multivariable model. Notably, rectosigmoid lesions had nearly twice the odds of containing covert SMIC, compared with proximal lesions (odds ratio, 1.9; 95% confidence interval, 1.2-3.0; P = .01). Other significant predictors of covert SMIC in the multivariable model included combined Paris classification and surface morphology (OR, 4.0; 95% CI, 1.2-12.7; P = .02) and increasing size (OR, 1.2 per 10-mm increase; 95% CI, 1.04-1.3; P = .01). Increased size showed an even greater effect in lesions exceeding 50 mm.
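To see why size matters more for very large lesions, note that a per-10-mm odds ratio compounds multiplicatively on the odds scale. The sketch below simply extrapolates the reported OR of 1.2 per 10 mm, assuming the log-linear size effect holds across the range; it is an illustration, not an analysis reported by the authors.

```python
# Extrapolating the reported OR of 1.2 per 10-mm increase in lesion size,
# assuming the log-linear size effect holds over the whole range (an assumption,
# not a result from the study).
OR_PER_10MM = 1.2

def odds_ratio_for_size_difference(delta_mm: float) -> float:
    """Odds ratio for a lesion delta_mm larger than a reference lesion."""
    return OR_PER_10MM ** (delta_mm / 10)

for delta in (10, 30, 50):
    print(f"+{delta} mm: OR ~{odds_ratio_for_size_difference(delta):.1f}")
# +10 mm: OR ~1.2, +30 mm: OR ~1.7, +50 mm: OR ~2.5
```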
Clinicians can use these factors to help evaluate the risk of invasive cancer in lesions without overt SMIC, the researchers said. “One lesion type that differs from the pattern is 0-IIa nongranular lesions,” they noted. “Once lesions with overt evidence of SMIC are excluded, these lesions have a low risk (4.2%) of harboring underlying cancer.” Although 42% of lesions with covert SMIC were SM1 (potentially curable by endoscopic resection), no predictor of covert SMIC also predicted SM1 status.
Funders included Cancer Institute of New South Wales and Gallipoli Medical Research Foundation. The investigators had no conflicts of interest.
In recent years, substantial efforts have been made to improve both colonoscopy preparation and endoscopic image quality to achieve improved polyp detection. In addition, while large, complex colon polyps (typically greater than 20 mm in size) previously were often referred for surgical resection, improved polyp resection techniques and equipment have led to the ability to remove many such lesions in a piecemeal fashion or en bloc via endoscopic mucosal resection (EMR) and endoscopic submucosal dissection (ESD).
The authors are to be congratulated for their meticulous and sustained efforts in acquiring and analyzing these data. These results provide endoscopists with important, practical, and entirely visual criteria to assess upon identification of large colon polyps, which can aid in determining which type of endoscopic therapy, if any, to pursue. Avoiding EMR when there is a reasonably high probability of invasive disease allows for choosing a more appropriate technique such as ESD (which is becoming increasingly available in the West) or surgery. In addition, patients can avoid the unnecessary EMR-related risks of bleeding and perforation when this technique is likely to result in an inadequate resection. Future work should assess whether this information can be widely adopted and utilized to achieve similar predictive accuracy in nonexpert settings.
V. Raman Muthusamy, MD, is director, interventional and general endoscopy, clinical professor of medicine, digestive diseases/gastroenterology, University of California, Los Angeles School of Medicine. He is a consultant for Medtronic and Boston Scientific.
FROM GASTROENTEROLOGY
Key clinical point: Large sessile or flat colorectal polyps or laterally spreading lesions had the highest risk of occult malignancy when they were distal 0-Is or 0-IIa+Is nongranular lesions. Proximally located 0-Is or 0-IIa granular lesions had the lowest risk.
Major finding: Only 0.7% of proximal 0-IIa granular lesions and 2.3% of 0-Is granular lesions contained occult submucosal invasive malignancies, compared with 21% of distal 0-Is nongranular lesions.
Data source: A multicenter prospective cohort study of 2,277 large colonic lesions from 2,106 consecutive patients.
Disclosures: Funders included Cancer Institute of New South Wales and Gallipoli Medical Research Foundation. The investigators had no conflicts of interest.
VIDEO: High myristic acid intake linked to relapse in ulcerative colitis
High intake of myristic acid approximately tripled the odds of relapse in patients with ulcerative colitis (UC), compared with low intake, according to the results of a 12-month multicenter, prospective, observational study reported in the September 2017 issue of Clinical Gastroenterology and Hepatology (doi: 10.1016/j.cgh.2016.12.036).
Relapsers consumed an average of 2.2 g of this saturated fatty acid daily from sources such as palm and coconut oils, as well as dairy fats, reported Edward L. Barnes, MD, MPH, and his associates at Brigham and Women’s Hospital, Boston, on behalf of the DREAM (Diet’s Role in Exacerbations of Mesalamine Maintenance) investigators. Nonrelapsers averaged 1.4 g/day. “Our broader goal is to determine how alterations in diet can improve the care of people with IBD [inflammatory bowel disease],” the researchers wrote. “These findings can help design interventional dietary studies to determine if supplementation or avoidance of certain compounds might reduce the risk of a flare for patients with ulcerative colitis in remission.”
Dietary factors are thought to underlie relapse in ulcerative colitis, but specific culprits are poorly defined, the investigators said. Therefore, the DREAM study prospectively tracked dietary intake and flares among a homogeneous group of 412 patients with UC from 25 academic and community gastroenterology practices in the United States. Between 2007 and 2014, patients were interviewed by telephone every 3 months for 1 year or until they reported a flare, defined as a Simple Clinical Colitis Activity Index score of at least 5 or a change in disease activity that entailed a change in medication.
A total of 34 patients were lost to follow-up, and 45 (11% of those remaining) flared within a year of study enrollment. “When analyzed in tertiles, increasing intake of multiple fatty acids was associated with increasing odds of relapse,” the researchers wrote. Predictors of flare in the univariate analysis included high intake of myristic acid, oleic acid, eicosenoic acid, palmitelaidic acid, total translinoleic acid, saturated fat, monounsaturated fat, and omega-3 fatty acids. These predictors also included moderate or high intake of alpha-linolenic acid. Only high intake of myristic acid maintained a significant dose-response relationship in the multivariable analysis (odds ratio, 3.0; 95% confidence interval, 1.2-7.7; P = .02 for high vs. low intake). Moderate intake of alpha-linolenic acid predicted flare (OR, 5.5; 95% CI, 1.6-19.3; P = .001) in the multivariable analysis, but high intake did not (OR, 1.3; 95% CI, 0.3-7.0; P = .4). “Other foods previously implicated in flares of UC, such as processed meat, alcohol, and foods high in sulfur, were not associated with an increased risk of flare,” the researchers wrote.
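For readers who want to see where odds ratios of this kind come from, the sketch below shows the standard crude calculation from a 2x2 table (highest vs. lowest intake tertile against flare status), with a Wald 95% confidence interval on the log-odds scale. The counts are hypothetical placeholders; the ORs quoted above come from the study’s multivariable models, not from this crude formula.

```python
import math

# Crude odds ratio and Wald 95% CI from a 2x2 table. The counts below are
# hypothetical placeholders for illustration; the published ORs come from the
# authors' multivariable models, not this calculation.
#                    flare   no flare
# highest tertile      a        b
# lowest tertile       c        d
a, b, c, d = 20, 100, 8, 120

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.1f} (95% CI, {ci_low:.1f}-{ci_high:.1f})")
```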
Study participants were generally in their mid- to late 40s, white, and not current smokers. More than half were male. Most had proctitis or left-sided colitis, not pancolitis. Relapsers averaged 2.4 flares in the 18 months before enrollment (standard deviation, 1.9), compared with 1.8 flares for nonrelapsers (SD, 2.4; P = .003).
This observational study was subject to unmeasured confounding and excluded many types of patients. Excluded were patients with a history of allergy to salicylates, aminosalicylates, or mesalamine tablets, as well as those with recent exposure to NSAIDs, oral or parenteral antibiotics, antidiarrheals, antispasmodics, immunosuppressives, biologics, or corticosteroids (except budesonide). Requiring monotherapy with an aminosalicylate might limit the generalizability of the findings, the investigators noted. Patients also were on variable doses of aminosalicylates, and higher doses might have helped inhibit flares.
Actavis and the National Institutes of Health provided funding. The investigators reported having no relevant financial conflicts.
Patients with inflammatory bowel disease commonly ask their physicians if dietary modifications can be made to control their disease. Despite the interest from patients, we have limited data to provide informed recommendations.
These results provide additional information to better guide our discussions with patients regarding diet and disease activity. However, the overall body of information remains sparse, and we should reinforce that dietary manipulation is an adjunct measure, at best, to our current medical therapies.
Rajesh Rasik Shah, MD, is an assistant professor of internal medicine and gastroenterology at Baylor College of Medicine, Houston. He has no conflicts of interest.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: High intake of myristic acid tripled the odds of relapse in patients with ulcerative colitis.
Major finding: Protein, processed meat, alcohol, and sulfur intake were not linked to UC relapse.
Data source: A multicenter prospective study of 412 patients with UC.
Disclosures: Actavis and the National Institutes of Health provided funding. The investigators reported having no relevant financial conflicts.
Liver cancer risk lower after sustained response to DAAs
Individuals with hepatitis C infection who achieved a sustained virologic response (SVR) to treatment with direct-acting antivirals had a significantly lower risk of hepatocellular carcinoma (HCC), a new study suggests.
A retrospective cohort study of 22,500 U.S. veterans with hepatitis C who had been treated with direct-acting antivirals (DAAs) found those with an SVR had a 72% lower risk of HCC, compared with those who did not achieve that response (hazard ratio, 0.28; 95% confidence interval, 0.22-0.36; P less than .0001), even after adjusting for demographics as well as clinical and health utilization factors.
“These data show that successful eradication of HCV [hepatitis C virus] confers a benefit in DAA-treated patients,” wrote Fasiha Kanwal, MD, from the Michael E. DeBakey Veterans Affairs Medical Center in Houston and her coauthors. “Although a few recent studies have raised concerns that DAA might accelerate the risk of HCC in some patients early in the course of treatment, we did not find any factors that differentiated patients with HCC that developed during DAA treatment.”
The results highlighted the importance of early treatment with antivirals, beginning well before the patients showed signs of progressing to advanced fibrosis or cirrhosis, the investigators noted.
“Delaying treatment until patients progress to cirrhosis might be associated with substantial downstream costs incurred as part of lifelong HCC surveillance and/or management of HCC,” they wrote.
Sustained virologic response to DAAs also was associated with a longer time to HCC diagnosis; patients who did not achieve SVR developed cancer at higher rates and earlier. The most common antivirals used were sofosbuvir (75.2%; 51.1% in combination with ledipasvir), the combination of paritaprevir/ritonavir (23.3%), daclatasvir-based treatments (0.8%), and simeprevir (0.7%).
Although SVR showed similarly beneficial effects on HCC risk in patients with and without cirrhosis, the authors also noted that patients with cirrhosis had a nearly fivefold greater risk of developing cancer than did those without (HR, 4.73; 95% CI, 3.34-6.68). Similarly, patients with a Fibrosis-4 (FIB-4) score greater than 3.25 had a sixfold higher risk of HCC, compared with those with a value of 1.45 or lower.
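For context, FIB-4 is a noninvasive fibrosis index calculated from age and routine laboratory values; the sketch below implements the standard formula (age times AST, divided by platelet count times the square root of ALT), with the 1.45 and 3.25 cutoffs cited above. The example values are hypothetical, not patient data from the study.

```python
import math

def fib4(age_years: float, ast_u_per_l: float, alt_u_per_l: float,
         platelets_10e9_per_l: float) -> float:
    """Standard FIB-4 index: (age x AST) / (platelets x sqrt(ALT))."""
    return (age_years * ast_u_per_l) / (platelets_10e9_per_l * math.sqrt(alt_u_per_l))

# Hypothetical example values (not data from the study)
score = fib4(age_years=62, ast_u_per_l=80, alt_u_per_l=64, platelets_10e9_per_l=110)
print(f"FIB-4 = {score:.2f}")  # >3.25 suggests advanced fibrosis; <=1.45 makes it unlikely
```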
Researchers commented that, at this level of risk, surveillance for HCC in these patients may be cost effective.
“Based on these data, HCC surveillance or risk modification may be needed for all patients who have progressed to cirrhosis or advanced fibrosis (as indicated by high FIB-4) at the time of SVR,” they wrote.
Alcohol use was also associated with a significantly higher annual incidence of HCC (HR, 1.56; 95% CI, 1.11-2.18).
Among the study cohort, 39% had cirrhosis, 29.7% had advanced fibrosis, and nearly one-quarter had previously been treated for hepatitis C infection. More than 40% also had diabetes, 61.4% reported alcohol use, and 54.2% had a history of drug use.
“DAAs offer a chance of cure for all patients with HCV, including patients with advanced cirrhosis, older patients, and those with alcohol use – all characteristics independently associated with risk of HCC in HCV,” the authors explained. “These data show the treated population has changed significantly in the DAA era to include many patients with other HCC risk factors; these differences likely explain why the newer cohorts of DAA-treated patients face higher absolute HCC risk than expected, based on historic data.”
The study was partly supported by the Department of Veterans Affairs’ Center for Innovations in Quality, Effectiveness, and Safety at the Michael E. DeBakey VA Medical Center. No conflicts of interest were declared.
The availability of direct-acting antivirals (DAAs) has revolutionized treatment of hepatitis C. Sustained virologic response (SVR) can be routinely achieved in more than 95% of patients – except in those with decompensated cirrhosis – with a 12-week course of these oral drugs, which have minimal adverse effects. Thus, guidelines recommend that all patients with hepatitis C should be treated with DAAs.1 It was a shock to the medical community when the recent Cochrane review concluded there was insufficient evidence to confirm or reject an effect of DAA therapy on HCV-related morbidity or all-cause mortality.2 The authors cautioned that the lack of valid evidence for DAAs’ effectiveness and the possibility of potential harm should be considered before treating people with hepatitis C with DAAs. Their conclusion was in part based on their rejection of SVR as a valid surrogate for clinical outcome. Previous studies of interferon-based therapies showed that SVR was associated with improvement in liver histology, decreased risk of hepatocellular carcinoma (HCC), and mortality.
Treatment of hepatitis C with DAAs represents one of only a handful of instances in which we can claim that a cure for a chronic disease is possible; however, treatment must be initiated early, before advanced fibrosis or cirrhosis develops, to avoid leaving patients with a persistent, though greatly reduced, risk of HCC. Physicians managing patients with hepatitis C should make treatment decisions based on evidence from the entire literature – which supports claims of the DAA treatment’s benefits and refutes allegations of its harmfulness – and should not be swayed by the misguided conclusions of the Cochrane review.
References
1. AASLD-IDSA. Recommendations for testing, managing, and treating hepatitis C. www.hcvguidelines.org. Accessed on July 2, 2017.
2. Jakobsen J.C., Nielsen E.E., Feinberg J., et al. Direct-acting antivirals for chronic hepatitis C. Cochrane Database Syst Rev. 2017 Jun 6;6:CD012143.
3. Curry M.P., O’Leary J.G., Bzowej N., et al. Sofosbuvir and velpatasvir for HCV in patients with decompensated cirrhosis. N Engl J Med. 2015;373(27):2618-28.
4. Kanwal F., Kramer J., Asch S.M., et al. Risk of hepatocellular cancer in HCV patients treated with direct acting antiviral agents. Gastroenterology. 2017 Jun 19. pii: S0016-5085(17)35797.
Anna S. Lok, MD, AGAF, FAASLD, is the Alice Lohrman Andrews Research Professor in Hepatology in the department of internal medicine at the University of Michigan Health System in Ann Arbor. She has received research grants from Bristol-Myers Squibb and Gilead through the University of Michigan.
Individuals with hepatitis C infection who achieved a sustained virologic response (SVR) to treatment with direct-acting antivirals had a significantly lower risk of hepatocellular carcinoma (HCC), a new study suggests.
A retrospective cohort study of 22,500 U.S. veterans with hepatitis C who had been treated with direct-acting antivirals (DAAs) found those with an SVR had a 72% lower risk of HCC, compared with those who did not achieve that response (hazard ratio, 0.28; 95% confidence interval, 0.22-0.36; P less than .0001), even after adjusting for demographics as well as clinical and health utilization factors.
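The 72% figure follows directly from the reported hazard ratio; as a quick arithmetic restatement (not a calculation spelled out in the paper):
\[ \text{risk reduction} = (1 - \mathrm{HR}) \times 100\% = (1 - 0.28) \times 100\% = 72\%. \]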
“These data show that successful eradication of HCV [hepatitis C virus] confers a benefit in DAA-treated patients,” wrote Fasiha Kanwal, MD, from the Michael E. DeBakey Veterans Affairs Medical Center in Houston and her coauthors. “Although a few recent studies have raised concerns that DAA might accelerate the risk of HCC in some patients early in the course of treatment, we did not find any factors that differentiated patients with HCC that developed during DAA treatment.”
The results highlighted the importance of early treatment with antivirals, beginning well before the patients showed signs of progressing to advanced fibrosis or cirrhosis, the investigators noted.
“Delaying treatment until patients progress to cirrhosis might be associated with substantial downstream costs incurred as part of lifelong HCC surveillance and/or management of HCC,” they wrote.
Sustained virologic response to DAAs also was associated with a longer time to HCC diagnosis, whereas patients who did not achieve SVR developed cancer at higher rates and much earlier. The most common antivirals used were sofosbuvir (75.2%; 51.1% in combination with ledipasvir), paritaprevir/ritonavir-based regimens (23.3%), daclatasvir-based treatments (0.8%), and simeprevir (0.7%).
While achieving SVR had similarly beneficial effects on HCC risk in patients with and without cirrhosis, the authors also noted that patients with cirrhosis had a nearly fivefold greater risk of developing cancer than did those without (HR, 4.73; 95% CI, 3.34-6.68). Similarly, patients with a Fibrosis-4 (FIB-4) score greater than 3.25 had a sixfold higher risk of HCC, compared with those with a value of 1.45 or lower.
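For readers unfamiliar with the index, FIB-4 is a standard noninvasive fibrosis score computed from age, aminotransferase levels, and platelet count; the formula below is general background rather than anything reported in the study:
\[ \mathrm{FIB\text{-}4} = \frac{\text{age (years)} \times \text{AST (U/L)}}{\text{platelets } (10^{9}/\mathrm{L}) \times \sqrt{\text{ALT (U/L)}}} \]
By convention, values below 1.45 argue against advanced fibrosis and values above 3.25 suggest it, which is why those two cutoffs appear in the risk comparison above.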
Researchers commented that, at this level of risk, surveillance for HCC in these patients may be cost effective.
“Based on these data, HCC surveillance or risk modification may be needed for all patients who have progressed to cirrhosis or advanced fibrosis (as indicated by high FIB-4) at the time of SVR,” they wrote.
Alcohol use was also associated with a significantly higher risk of HCC (HR, 1.56; 95% CI, 1.11-2.18).
Among the study cohort, 39% had cirrhosis, 29.7% had advanced fibrosis, and nearly one-quarter had previously been treated for hepatitis C infection. More than 40% also had diabetes, 61.4% reported alcohol use, and 54.2% had a history of drug use.
“DAAs offer a chance of cure for all patients with HCV, including patients with advanced cirrhosis, older patients, and those with alcohol use – all characteristics independently associated with risk of HCC in HCV,” the authors explained. “These data show the treated population has changed significantly in the DAA era to include many patients with other HCC risk factors; these differences likely explain why the newer cohorts of DAA-treated patients face higher absolute HCC risk than expected, based on historic data.”
The study was partly supported by the Department of Veterans Affairs’ Center for Innovations in Quality, Effectiveness, and Safety at the Michael E. DeBakey VA Medical Center. No conflicts of interest were declared.
FROM GASTROENTEROLOGY
Key clinical point: Achieving a sustained virologic response to direct-acting antiviral therapy for hepatitis C was associated with a markedly lower risk of hepatocellular carcinoma.
Major finding: Individuals who achieved an SVR to antiviral treatment for hepatitis C infection had a 72% lower risk of hepatocellular carcinoma than did those without a sustained response.
Data source: Retrospective cohort study in 22,500 U.S. veterans with hepatitis C.
Disclosures: The study was partly supported by the Department of Veterans Affairs’ Center for Innovations in Quality, Effectiveness, and Safety at the Michael E. DeBakey VA Medical Center. No conflicts of interest were declared.
VIDEO: Autoimmune hepatitis with cirrhosis tied to hepatocellular carcinoma
The presence of cirrhosis in patients with autoimmune hepatitis markedly increased their risk of hepatocellular carcinoma, according to a systematic review and meta-analysis of 25 cohort studies and 6,528 patients.
Estimated rates of hepatocellular carcinoma (HCC) were 10.1 (6.9-14.7) cases per 1,000 person-years in these patients versus 1.1 (0.6-2.2) cases per 1,000 person-years in patients without cirrhosis at diagnosis, Aylin Tansel, MD, of Baylor College of Medicine in Houston, and associates reported in Clinical Gastroenterology and Hepatology (doi: 10.1016/j.cgh.2017.02.006). Thus, surveillance for HCC “might be cost effective in this population,” they wrote. “However, patients with AIH [autoimmune hepatitis] without cirrhosis at index diagnosis, particularly those identified from general populations, are at an extremely low risk of HCC.”
Autoimmune hepatitis may be asymptomatic at presentation or may cause severe acute hepatitis or even fulminant liver failure. Even with immunosuppressive therapy, patients progress to cirrhosis at reported annual rates of 0.1%-8%. HCC is the fastest-growing cause of cancer mortality in the United States, and the American Association for the Study of Liver Diseases (AASLD) recommends enhanced surveillance for this disease in patients whose annual estimated risk is at least 1.5%. Although the European Association for the Study of the Liver recommends screening for HCC in patients with autoimmune hepatitis and cirrhosis, AASLD makes no such recommendation, the reviewers noted. To study the risk of HCC in patients with autoimmune hepatitis, they searched PubMed, Embase, and reference lists for relevant cohort studies published through June 2016. This work yielded 20 papers and five abstracts with a pooled median follow-up period of 8 years.
The overall pooled incidence of HCC was 3.1 (95% confidence interval, 2.2-4.2) cases per 1,000 person-years, the reviewers wrote. Among patients with cirrhosis at diagnosis, the pooled incidence of 10.1 cases per 1,000 person-years corresponds to roughly 1.0% per year, and the upper bound of its 95% confidence interval nearly reached the 1.5% cutoff recommended by AASLD, they said. Furthermore, 5 of 16 studies that investigated the risk of HCC in patients with concurrent cirrhosis reported incidence rates above 1.5%. Among 93 patients who developed HCC in the meta-analysis, only 1 did not have cirrhosis by the time autoimmune hepatitis was diagnosed.
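Converting the person-year rates into the annual percentages against which the 1.5% AASLD threshold is framed is simple arithmetic (a back-of-the-envelope restatement, not a calculation reported by the reviewers):
\[ \frac{3.1}{1{,}000\ \text{person-years}} \approx 0.31\%\ \text{per year}, \qquad \frac{10.1\ (6.9\text{-}14.7)}{1{,}000\ \text{person-years}} \approx 1.0\ (0.7\text{-}1.5)\%\ \text{per year}. \]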
The meta-analysis also linked HCC to older age and Asian ethnicity among patients with autoimmune hepatitis, as has been reported before. Male sex only slightly increased the risk of HCC, but the studies included only about 1,130 men, the reviewers noted. Although the severity of autoimmune hepatitis varied among studies, higher rates of relapse predicted HCC in two cohorts. Additionally, one study linked alcohol abuse to a sixfold higher risk of HCC among patients with autoimmune hepatitis. “These data support careful monitoring of patients with autoimmune hepatitis, particularly older men, patients with multiple autoimmune hepatitis relapses, and those with ongoing alcohol abuse,” the investigators wrote.
They found no evidence of publication bias, but each individual study included at most 15 cases of HCC, so pooled incidence rates were probably imprecise, especially for subgroups, they said. Studies also inconsistently reported HCC risk factors, often lacked comparison groups, and usually did not report the effects of surveillance for HCC.
Dr. Tansel reported receiving support from the National Institutes of Health. The reviewers had no conflicts of interest.
Serial imaging surveillance facilitates detection of hepatocellular carcinoma (HCC) at a stage amenable to potentially curative resection or liver transplantation. The AASLD, EASL, and APASL recommend surveillance for cirrhotic patients; however, the AASLD stipulates that the incidence of HCC exceed the threshold of cost-effectiveness of 1.5% per year. Whether HCC surveillance in cirrhotic patients with autoimmune hepatitis (AIH) is cost effective remains controversial. The systematic review and meta-analysis by Tansel et al. of 25 rigorously selected cohort studies of AIH addresses this question by calculating incidence rates of HCC per 1,000 person-years using 95% confidence intervals derived from event rates in relation to the duration of follow-up.
John M. Vierling, MD, FACP, FAASLD, is professor of medicine and surgery, chief of hepatology, Baylor College of Medicine, Houston. He has received grant support from Taiwan J and Novartis and is on the scientific advisory board for Novartis.
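As a rough illustration of the kind of calculation Dr. Vierling describes (a generic sketch with hypothetical numbers, not the reviewers’ actual data or method), an incidence rate and an approximate 95% confidence interval can be derived from the number of events and the total follow-up time:
\[ \text{rate} = \frac{\text{events}}{\text{person-years}}, \qquad 95\%\ \mathrm{CI} \approx \text{rate} \times \exp\!\left(\pm\frac{1.96}{\sqrt{\text{events}}}\right). \]
For a hypothetical cohort with 10 HCC cases over 4,000 person-years of follow-up, this gives 2.5 cases per 1,000 person-years with a 95% CI of roughly 1.3-4.6.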
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Key clinical point: Patients with autoimmune hepatitis and cirrhosis are at increased risk of hepatocellular carcinoma.
Major finding: For every 1,000 person-years, there were 3.1 (95% CI, 2.2-4.2) cases of hepatocellular carcinoma overall, 10.1 (6.9-14.7) cases in patients who also had cirrhosis at diagnosis, and 1.1 (0.6-2.2) cases in patients who did not have cirrhosis at diagnosis.
Data source: A systematic review and meta-analysis of 25 studies of 6,528 patients with autoimmune hepatitis.
Disclosures: Dr. Tansel reported receiving support from the National Institutes of Health. The investigators disclosed no conflicts.
VIDEO: Meta-analysis favors anticoagulation for patients with cirrhosis and portal vein thrombosis
Patients with cirrhosis and portal vein thrombosis (PVT) who received anticoagulation therapy had nearly fivefold greater odds of recanalization compared with untreated patients, and were no more likely to experience major or minor bleeding, in a pooled analysis of eight studies published in the August issue of Gastroenterology (doi: 10.1053/j.gastro.2017.04.042).
Rates of any recanalization were 71% in treated patients and 42% in untreated patients (P less than .0001), wrote Lorenzo Loffredo, MD, of Sapienza University, Rome, and his coinvestigators. Rates of complete recanalization were 53% and 33%, respectively (P = .002), rates of spontaneous variceal bleeding were 2% and 12% (P = .04), and bleeding affected 11% of patients in each group. Together, the findings “show that anticoagulants are efficacious and safe for treatment of portal vein thrombosis in cirrhotic patients,” although larger, interventional clinical trials are needed to pinpoint the clinical role of anticoagulation in cirrhotic patients with PVT, the reviewers reported.
Bleeding from portal hypertension is a major complication of cirrhosis, and PVT, which affects about 20% of patients, also predicts poor outcomes, they noted. Anticoagulation in this setting can be difficult because patients often have concurrent coagulopathies that are hard to assess with standard techniques, such as the prothrombin time/international normalized ratio (PT-INR). Although some studies support anticoagulating these patients, data are limited. Therefore, the reviewers searched PubMed, the ISI Web of Science, SCOPUS, and the Cochrane database through Feb. 14, 2017, for trials comparing anticoagulation with no treatment in patients with cirrhosis and PVT.
This search yielded eight trials of 353 patients who received low-molecular-weight heparin, warfarin, or no treatment for about 6 months, with a typical follow-up period of 2 years. The reviewers found no evidence of publication bias or significant heterogeneity among the trials. Six studies evaluated complete recanalization, another set of six studies tracked progression of PVT, a third set of six studies evaluated major or minor bleeding events, and four studies evaluated spontaneous variceal bleeding. Compared with no treatment, anticoagulation was tied to a significantly greater likelihood of complete recanalization (pooled odds ratio, 3.4; 95% confidence interval, 1.5-7.4; P = .002), a significantly lower chance of PVT progressing (9% vs. 33%; pooled odds ratio, 0.14; 95% CI, 0.06-0.31; P less than .0001), no difference in bleeding rates (11% in each pooled group), and a significantly lower risk of spontaneous variceal bleeding (OR, 0.23; 95% CI, 0.06-0.94; P = .04).
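A note on reading the odds ratios (general background, not a recalculation of the pooled estimates): an odds ratio compares odds rather than percentages,
\[ \mathrm{OR} = \frac{p_1/(1-p_1)}{p_2/(1-p_2)}, \]
so complete-recanalization rates of 53% versus 33% correspond to crude odds of about 1.13 versus 0.49, a crude ratio near 2.3. The reported pooled OR of 3.4 differs because the meta-analytic estimate weights the individual studies rather than the aggregate percentages.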
“Metaregression analysis showed that duration of anticoagulation did not influence outcomes,” the reviewers wrote. “Low-molecular-weight heparin, but not warfarin, was significantly associated with a complete PVT resolution as compared to untreated patients, while both low-molecular-weight heparin and warfarin were effective in reducing PVT progression.” That finding merits careful interpretation, however, because most studies on warfarin were retrospective and lacked data on the quality of anticoagulation, they added.
“It is a challenge to treat patients with cirrhosis using anticoagulants because of the perception that the coexistent coagulopathy could promote bleeding,” the researchers wrote. Nonetheless, their analysis suggests that anticoagulation has significant benefits and does not increase bleeding risk, regardless of the severity of liver failure, they concluded.
The reviewers reported having no funding sources or conflicts of interest.
FROM GASTROENTEROLOGY
Key clinical point: Anticoagulation produced favorable outcomes with no increase in bleeding risk in patients with cirrhosis and portal vein thrombosis.
Major finding: Rates of any recanalization were 71% in treated patients and 42% in untreated patients (P less than .0001); rates of complete recanalization were 53% and 33%, respectively (P = .002); rates of spontaneous variceal bleeding were 2% and 12% (P = .04); and bleeding affected 11% of patients in each group.
Data source: A systematic review and meta-analysis of eight studies of 353 patients with cirrhosis and portal vein thrombosis.
Disclosures: The reviewers reported having no funding sources or conflicts of interest.