
Pantoprazole not needed for most patients on anticoagulant/antiplatelet therapies


For most patients taking antiplatelet and/or anticoagulant therapies, the proton pump inhibitor (PPI) pantoprazole is unnecessary, based on findings from the prospective COMPASS trial, which involved more than 17,000 participants.

Pantoprazole may reduce the risk of bleeding from gastroduodenal lesions, but it is unlikely to prevent upper-gastrointestinal events, reported lead author Paul Moayyedi, MB ChB, PhD, of McMaster University in Hamilton, Canada, and colleagues.

The investigators wrote in Gastroenterology, “Guidelines suggest that patients receiving the combination of antiplatelet and anticoagulant therapy should receive PPIs to reduce the risk of upper-GI bleeding. However … there are no randomized data to support the use of PPI therapy in patients taking oral anticoagulants, and a paucity of data relating to aspirin.”

To fill this knowledge gap, the investigators recruited 17,598 participants from 33 countries who had stable coronary artery disease and/or peripheral artery disease. Participants were randomized to one of three groups: 100-mg aspirin once daily, 5-mg rivaroxaban twice daily, or a combination of 2.5-mg rivaroxaban twice daily with 100-mg aspirin once daily. This part of the trial was discontinued before completion because of early cardiovascular advantages associated with combination therapy over aspirin alone, and related findings were reported previously. While combination therapy did reduce cardiovascular risks, it had less favorable effects on gut health, highlighted by an associated increase in major GI bleeding events. Despite early cessation of the cardiovascular portion of the trial, the pantoprazole regimen was continued, offering a look at the effect of long-term PPI use on gut health.

At baseline, about two-thirds of participants (64%) were not taking a PPI and were randomized to either 40-mg pantoprazole once daily or matching placebo. The primary efficacy outcome was time to first upper-GI clinical event, defined as a composite of the following: upper-GI obstruction, perforation, at least five gastroduodenal erosions with at least 3 days of GI pain, symptomatic gastroduodenal ulcer involving at least 3 days of GI pain, overt upper-GI bleeding of unknown origin, occult bleeding (drop in hemoglobin of at least 2 g/dL), or overt bleeding from a gastroduodenal lesion (active bleeding during endoscopy). In addition to this measure, the investigators evaluated a post-hoc endpoint with a looser definition of peptic ulcer events, most notably eliminating the requirement that a lesion be actively bleeding during endoscopy.
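
As a purely illustrative sketch of how such a composite, time-to-first-event endpoint can be derived from per-patient event records, the snippet below uses hypothetical event labels and data layout; it is not the trial's analysis code.

```python
from datetime import date

# Hypothetical labels for the components of the composite upper-GI endpoint
COMPOSITE_EVENTS = {
    "obstruction",            # upper-GI obstruction
    "perforation",
    "erosions_with_pain",     # >=5 gastroduodenal erosions plus >=3 days of GI pain
    "symptomatic_ulcer",      # gastroduodenal ulcer plus >=3 days of GI pain
    "overt_bleed_unknown",    # overt upper-GI bleeding of unknown origin
    "occult_bleed",           # hemoglobin drop of >=2 g/dL
    "overt_bleed_gd_lesion",  # actively bleeding gastroduodenal lesion at endoscopy
}

def days_to_first_event(randomization, events):
    """Days from randomization to the first qualifying event, or None if none occurred."""
    qualifying = [d for d, kind in events if kind in COMPOSITE_EVENTS and d >= randomization]
    if not qualifying:
        return None  # patient is censored at the end of follow-up
    return (min(qualifying) - randomization).days

# Example: an occult bleed recorded 400 days after randomization
print(days_to_first_event(date(2016, 1, 1), [(date(2017, 2, 4), "occult_bleed")]))  # 400
```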

Most patients in the trial (78%) were male, and 23% were current smokers. Smaller proportions of the population were taking a nonsteroidal anti-inflammatory drug (5%) and/or had a history of peptic ulcer disease (2.6%). The median follow-up was 3.01 years, ranging from 2.49 to 3.59 years. Permanent discontinuations occurred at approximately equal rates in the pantoprazole (21%) and placebo (22%) groups, after a median of 11 months (338 days). In both groups, more than 96% of participants who continued treatment took their medications as prescribed at least 80% of the time.

Analysis showed that upper-GI events occurred marginally less often in the pantoprazole group than the placebo group, but without statistical significance (1.2% vs. 1.3%; P = .35). Of the outcomes measured, only overt bleeding of gastroduodenal origin detected by radiography or endoscopy was statistically less common in the pantoprazole group than the placebo group, with a 48% reduced rate (0.2% vs. 0.4%; P = .03). No statistical efficacy differences or statistical interactions were detected between population subgroups.

“The data suggest that routine use of PPI therapy is not warranted for patients receiving low-dose rivaroxaban with or without aspirin for the prevention of atherothrombotic events in patients with stable coronary artery disease or symptomatic peripheral artery disease, as there was no overall impact on clinical upper-GI events or upper-GI bleeding,” the investigators wrote. “This is in contrast to previous systematic reviews of randomized trials reporting that PPIs were associated with a 50%-70% reduction in bleeding and symptomatic peptic ulcers related to nonsteroidal anti-inflammatory drugs, including in the critical care setting.”

Post-hoc analysis, which allowed for a broader definition of upper-GI events related to gastroduodenal ulcers, revealed a slightly greater reduction in risk of bleeding lesions in patients taking pantoprazole, compared with placebo (hazard ratio, 0.45), and additional risk reductions for peptic ulcers (HR, 0.46) and erosions (HR, 0.33). Ultimately, pantoprazole reduced the combined rate of post-hoc events by 56%.
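
As a quick arithmetic check (ours, not an additional analysis by the authors), the percent reductions quoted in this and the preceding paragraph follow directly from the reported ratios:

\[ \text{relative risk reduction} = (1 - \text{ratio}) \times 100\% \]

so a hazard ratio of 0.45 corresponds to roughly a 55% lower risk, 0.46 to about 54%, and 0.33 to about 67%, while the 56% combined reduction corresponds to a hazard ratio of about 0.44; likewise, the crude event rates above of 0.2% vs. 0.4% imply 1 − 0.2/0.4 = 50%, consistent with the adjusted 48% figure.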

The investigators noted that these ulcer- and erosion-reducing effects of pantoprazole align with previous reports. “It is therefore possible that PPIs might be beneficial for patients at particularly high risk for peptic ulcer disease who are also taking aspirin and/or anticoagulants,” the investigators concluded.

The COMPASS trial was funded by Bayer AG. The investigators disclosed additional relationships with Allergan, Takeda, Janssen, and others.

SOURCE: Moayyedi P et al. Gastroenterology. 2019 May 2. doi: 10.1053/j.gastro.2019.04.041.


Clopidogrel matches aspirin for reducing risk of colorectal cancer


Clopidogrel appears to reduce the risk of colorectal cancer (CRC) as much as low-dose aspirin, based on a case-control study involving more than 15,000 cases.

Source: American Gastroenterological Association

Risk of CRC was reduced by 20%-30% when clopidogrel was given alone or in combination with aspirin, reported lead author Antonio Rodríguez-Miguel of Príncipe de Asturias University Hospital in Madrid and colleagues. This finding adds support to the hypothesis that low-dose aspirin is chemoprotective primarily because of its antiplatelet properties, they noted.

“The mechanism of action of low-dose aspirin to explain its protective effect is subject to debate,” the investigators wrote in Clinical Gastroenterology and Hepatology. “Although aspirin is a nonsteroidal anti-inflammatory drug (NSAID) and these drugs are known to prevent CRC through the inhibition of cyclooxygenase (COX)-2 in epithelial and stromal cells in the large bowel, at low doses (75-300 mg/d) aspirin has only transient effects on this isozyme, while permanently inactivating platelet COX-1 and suppressing thromboxane A2 production. The apparent lack of dose-dependence of the chemoprotective effect of aspirin, as well as the potential role of locally activated platelets in upregulating COX-2 expression in adjacent nucleated cells of the intestinal mucosa, have led [to] the postulation that low-dose aspirin could exert its chemoprotective effect via its antiplatelet action.”

Although previous studies have explored the chemoprotective potential of other antiplatelet agents, such as clopidogrel, the resultant body of evidence remains small. In 2017, for example, Avi Leader, MD, and colleagues reported that the chemoprotective effect of dual-antiplatelet therapy (DAPT) with clopidogrel and aspirin was superior to aspirin monotherapy, based on an additional 8% risk reduction. The present study aimed to build on such findings with evaluation of a Mediterranean cohort, which could reduce confounding lifestyle factors, owing to a lower rate of cardiovascular morbidity than other populations.

The nested, case-control study involved 15,491 cases of CRC and 60,000 controls who were randomly selected and frequency matched by sex, age, and year of indexing. Data were drawn from Base de datos para la Investigación Farmacoepidemiológica en Atención Primaria (BIFAP), a Spanish medical record database with more than 7 million patients. Records of patients involved in the present study were screened for prescription of three antiplatelet agents: low-dose aspirin, clopidogrel, and triflusal. Additional categorization identified current users, recent users, past users, and nonusers. The effects of clopidogrel and aspirin were evaluated separately, as monotherapies, and together, as DAPT.
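
To illustrate the kind of exposure classification described here, the sketch below assigns each patient to a category based on the gap between the end of their most recent prescription and the index date; the 90- and 365-day windows are illustrative assumptions, not the cutoffs used in the BIFAP analysis.

```python
def classify_exposure(days_since_last_supply_ended):
    """Classify antiplatelet exposure at the index date.

    days_since_last_supply_ended: days between the end of the most recent
    prescription's supply and the index date; None means no prescription
    was ever recorded. Window cutoffs are illustrative only.
    """
    if days_since_last_supply_ended is None:
        return "nonuser"
    if days_since_last_supply_ended <= 0:    # supply still covers the index date
        return "current user"
    if days_since_last_supply_ended <= 90:
        return "recent user"
    if days_since_last_supply_ended <= 365:
        return "past user"
    return "nonuser"

# Example: last prescription ran out 30 days before the index date
print(classify_exposure(30))  # -> "recent user"
```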

Demographically, the mean age of the entire study population was 68.6 years, with a slight male predominance (59%). Median follow-up was similar between cases and controls, at approximately 3 years, ranging from about 1.5 to 6 years. Cases showed higher rates of gout, alcohol abuse, acute digestive diseases, and peripheral artery disease, whereas controls were more likely to have histories involving stroke, acute myocardial infarction, chronic digestive diseases, and constipation.

Controls were more likely to be current aspirin users than patients diagnosed with CRC (12.8% vs. 12.2%), giving an associated adjusted odds ratio (AOR) of 0.83. Risk reduction became statistically apparent after 180 days of aspirin usage, with an AOR of 0.79, and more prominent in the 1- to 3-year range, with an AOR of 0.73. This chemoprotective effect faded rapidly with discontinuation.

Current clopidogrel usage led to a comparable level of risk reduction, with an AOR of 0.80. It wasn’t until a year of continuous clopidogrel monotherapy that risk reduction became statistically significant, with an AOR of 0.65, which dropped to 0.57 between years 1 and 3.

Turning to a matched comparison of aspirin or clopidogrel monotherapy versus DAPT, the investigators found similar rates of chemoprotection. Current aspirin usage of any duration offered an adjusted risk reduction of 17%, compared with 25% for clopidogrel, and 29% for DAPT. Beyond 1 year of continuous and current usage, the superiority of DAPT was called into question, as clopidogrel monotherapy offered the greatest risk reduction, at 37%, compared with 22% for aspirin, and 22% for DAPT. Risk analyses involving triflusal lacked statistical significance.
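
Because CRC is a relatively rare outcome, the adjusted odds ratios reported above approximate relative risks, so the quoted risk reductions are simply (our arithmetic, not an additional analysis):

\[ \text{adjusted risk reduction} \approx (1 - \text{AOR}) \times 100\%, \]

for example, the AOR of 0.83 for current aspirin use corresponds to the 17% reduction, and the AOR of 0.80 for current clopidogrel use to the 20% reduction.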

“The results of the present study are compatible with a chemoprotective effect of clopidogrel against CRC, equivalent in magnitude to the one observed for low-dose aspirin,” the investigators wrote. “This finding indirectly supports the hypothesis that the chemoprotective effect of low-dose aspirin is mediated mostly through the permanent inactivation of platelet COX-1.”

The investigators pointed out that the chemoprotective effects of antiplatelet therapy begin to appear early in treatment, independently from lifestyle factors, but risk reduction depends on current usage. Although short-term usage of either aspirin or clopidogrel was associated with an increased risk of CRC, the investigators suggested that this was more likely a perceived risk rather than an actual one. “In our view, this observation could be explained in part by a detection bias, owing to an increased risk of GI bleeding induced by antiplatelet agents that could lead to a greater number of colonoscopies, and, as a result, an early cancer diagnosis,” they wrote.

The study was funded by the Fundación Instituto Teófilo Hernando. Dr. García-Rodríguez disclosed a relationship with CEIFE, which has received funding from Bayer and AstraZeneca.

SOURCE: Rodríguez-Miguel A et al. Clin Gastroenterol Hepatol. 2018 Dec 20. doi: 10.1016/j.cgh.2018.12.012.

Does clopidogrel reduce colorectal cancer risk?

The role of aspirin in reducing the risk of colorectal cancer is well established, although the mechanisms of action are not entirely clear. One possible mechanism is inhibition of the cyclooxygenase-1 (COX-1) pathway in platelets. In a case-control study from Spain, the authors investigated not only aspirin but also clopidogrel, an antiplatelet drug that acts through a different route (P2Y12 receptor inhibition rather than COX-1), in reducing the risk of CRC. CRC cases were randomly matched with cancer-free controls, and the use of aspirin and clopidogrel as risk factors for CRC was studied. Not surprisingly, aspirin use was associated with a 17% reduced risk of CRC. What is new is that clopidogrel use was also associated with a reduced risk of CRC, by 20%, but dual therapy (aspirin plus clopidogrel) did not confer additional benefit. The results did not differ by patient age or sex. The caveat is that the history of CRC screening or colonoscopy was not known for cases or controls, and many other confounders, such as diet, exercise, and other lifestyle and medication history, that may account for the differences could not be easily teased apart. If confirmed by others, these data suggest an additional beneficial effect of the antiplatelet agent clopidogrel in reducing the risk of CRC when taken for more than 1 year. The study opens the door to exploring the mechanisms by which antiplatelet agents may reduce the risk of CRC, and the potential role of other antiplatelet agents in doing so.

Aasma Shaukat, MD, MPH, is GI section chief at the Minneapolis VAMC and professor of medicine at the University of Minnesota, Minneapolis. She has no conflicts of interest.
 


Vitals

 

Key clinical point: Clopidogrel usage appears to reduce the risk of colorectal cancer as much as low-dose aspirin.

Major finding: Current clopidogrel usage was associated with a 20% reduced risk of colorectal cancer (adjusted odds ratio, 0.8).

Study details: A nested case-control study involving 15,491 cases of colorectal cancer and 60,000 controls.

Disclosures: The study was funded by the Fundación Instituto Teófilo Hernando. Dr. García-Rodríguez disclosed a relationship with CEIFE, which has received funding from Bayer and AstraZeneca.

Source: Rodríguez-Miguel A et al. Clin Gastroenterol Hepatol. 2018 Dec 20. doi: 10.1016/j.cgh.2018.12.012.


Inflammation diminishes quality of life in NAFLD, not fibrosis


A variety of demographic and disease-related factors contribute to poorer quality of life in patients with nonalcoholic fatty liver disease (NAFLD), based on questionnaires involving 304 European patients.

In contrast with previous research, lobular inflammation, but not hepatic fibrosis, was associated with worse quality of life, reported lead author Yvonne Huber, MD, of Johannes Gutenberg University in Mainz, Germany, and colleagues. Women and those with advanced disease or comorbidities had the lowest health-related quality of life (HRQL) scores. The investigators suggested that these findings could be used for treatment planning at both the population and the patient level.

“With the emergence of medical therapy for [nonalcoholic steatohepatitis (NASH)], it will be of importance to identify patients with the highest unmet need for treatment,” the investigators wrote in Clinical Gastroenterology and Hepatology, emphasizing that therapies targeting inflammation could provide the greatest relief.

To determine which patients with NAFLD were most affected by their condition, the investigators used the Chronic Liver Disease Questionnaire (CLDQ), which assesses physical, mental, social, and emotional function, with lower scores indicating poorer health-related quality of life. “[The CLDQ] more specifically addresses symptoms of patients with chronic liver disease, including extrahepatic manifestations, compared with traditional HRQL measures such as the [Short Form–36 (SF-36)] Health Survey Questionnaire,” the investigators explained. Recent research has used the CLDQ to reveal a variety of findings, the investigators noted, such as a 2016 study by Alt and colleagues outlining the most common symptoms in noninfectious chronic liver disease (abdominal discomfort, fatigue, and anxiety), and two studies by Younossi and colleagues describing quality of life improvements after curing hepatitis C virus, and negative impacts of viremia and hepatic inflammation in patients with hepatitis B.

The current study involved 304 patients with histologically confirmed NAFLD who were prospectively entered into the European NAFLD registry via centers in Germany (n = 133), the United Kingdom (n = 154), and Spain (n = 17). Patient data included demographic factors, laboratory findings, and histologic features. Within 6 months of liver biopsy, patients completed the CLDQ.

The mean patient age was 52.3 years, with slightly more men than women (53.3% vs. 46.7%). Most patients (75%) were obese, with a median body mass index of 33.3 kg/m². More than two-thirds of patients (69.1%) had NASH, while approximately half of the population (51.4%) had moderate steatosis, no or low-grade fibrosis (F0-2, 58.2%), and no or low-grade lobular inflammation (grade 0 or 1, 54.7%). The three countries had significantly different population profiles; for example, the United Kingdom had an approximately 10% higher prevalence of type 2 diabetes and obesity compared with the entire cohort, but a rate of arterial hypertension that was lower by a similar margin. The United Kingdom also had a significantly lower mean CLDQ score than the study population as a whole (4.73 vs. 4.99).

Analysis of the entire cohort revealed that a variety of demographic and disease-related factors negatively impacted health-related quality of life. Women had a significantly lower mean CLDQ score than men (4.62 vs. 5.31; P < .001), more often reporting abdominal symptoms, fatigue, systemic symptoms, reduced activity, diminished emotional functioning, and worry. The overall CLDQ score was negatively influenced by obesity (4.83 vs. 5.46), type 2 diabetes (4.74 vs. 5.25), and hyperlipidemia (4.84 vs. 5.24), but not hypertension. Laboratory findings that negatively correlated with CLDQ included aspartate transaminase (AST) and HbA1c, whereas ferritin was positively correlated.

Generally, patients with NASH reported worse quality of life than those with NAFLD alone (4.85 vs. 5.31). Factors contributing most to this disparity were fatigue, systemic symptoms, activity, and worry. On a histologic level, hepatic steatosis, ballooning, and lobular inflammation predicted poorer quality of life; although advanced fibrosis and compensated cirrhosis were associated with a trend toward reduced quality of life, this pattern lacked statistical significance. Multivariate analysis, which accounted for age, sex, body mass index, country, and type 2 diabetes, revealed independent associations between reduced quality of life and type 2 diabetes, sex, age, body mass index, and hepatic inflammation, but not fibrosis.
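
For readers who want to see what such an adjusted model looks like in practice, here is a minimal sketch assuming a per-patient data frame; the column names, the file name, and the use of an ordinary least squares model for the CLDQ score are assumptions for illustration, not the authors' exact specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-patient table: cldq (overall score), age, sex, bmi, country,
# t2dm (0/1), inflammation (lobular inflammation grade), fibrosis (stage 0-4)
df = pd.read_csv("nafld_registry.csv")  # hypothetical file name

# Linear model of the CLDQ score adjusted for the covariates named in the text
model = smf.ols(
    "cldq ~ age + C(sex) + bmi + C(country) + t2dm + inflammation + fibrosis",
    data=df,
).fit()

print(model.params)   # adjusted association of each covariate with the CLDQ score
print(model.pvalues)  # which associations remain significant after adjustment
```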

“The striking finding of the current analysis in this well-characterized European cohort was that, in contrast to the published data on predictors of overall and liver-specific mortality, lobular inflammation correlated independently with HRQL,” the investigators wrote. “These results differ from the NASH [Clinical Research Network] cohort, which found lower HRQL using the generic [SF-36 Health Survey Questionnaire] in NASH compared with a healthy U.S. population and a significant effect in cirrhosis only.” The investigators suggested that mechanistic differences in disease progression could explain this discordance.

Although hepatic fibrosis has been tied with quality of life by some studies, the investigators pointed out that patients with chronic hepatitis B or C have reported improved quality of life after viral elimination or suppression, which reduce inflammation, but not fibrosis. “On the basis of the current analysis, it can be expected that improvement of steatohepatitis, and in particular lobular inflammation, will have measurable influence on HRQL even independently of fibrosis improvement,” the investigators concluded.

The study was funded by H2020. The investigators reported no conflicts of interest.

SOURCE: Huber Y et al. Clin Gastroenterol Hepatol. 2018 Dec 20. doi: 10.1016/j.cgh.2018.12.016.


A variety of demographic and disease-related factors contribute to poorer quality of life in patients with nonalcoholic fatty liver disease (NAFLD), based on questionnaires involving 304 European patients.

In contrast with previous research, lobular inflammation, but not hepatic fibrosis, was associated with worse quality of life, reported to lead author Yvonne Huber, MD, of Johannes Gutenberg University in Mainz, Germany, and colleagues. Women and those with advanced disease or comorbidities had the lowest health-related quality of life (HRQL) scores. The investigators suggested that these findings could be used for treatment planning at a population and patient level.

“With the emergence of medical therapy for [nonalcoholic steatohepatitis (NASH)], it will be of importance to identify patients with the highest unmet need for treatment,” the investigators wrote in Clinical Gastroenterology and Hepatology, emphasizing that therapies targeting inflammation could provide the greatest relief.

To determine which patients with NAFLD were most affected by their condition, the investigators used the Chronic Liver Disease Questionnaire (CLDQ), which assesses physical, mental, social, and emotional function, with lower scores indicating poorer health-related quality of life. “[The CLDQ] more specifically addresses symptoms of patients with chronic liver disease, including extrahepatic manifestations, compared with traditional HRQL measures such as the [Short Form–36 (SF-36)] Health Survey Questionnaire,” the investigators explained. Recent research has used the CLDQ to reveal a variety of findings, the investigators noted, such as a 2016 study by Alt and colleagues outlining the most common symptoms in noninfectious chronic liver disease (abdominal discomfort, fatigue, and anxiety), and two studies by Younossi and colleagues describing quality of life improvements after curing hepatitis C virus, and negative impacts of viremia and hepatic inflammation in patients with hepatitis B.

The current study involved 304 patients with histologically confirmed NAFLD who were prospectively entered into the European NAFLD registry via centers in Germany (n = 133), the United Kingdom (n = 154), and Spain (n = 17). Patient data included demographic factors, laboratory findings, and histologic features. Within 6 months of liver biopsy, patients completed the CLDQ.

The mean patient age was 52.3 years, with slightly more men than women (53.3% vs. 46.7%). Most patients (75%) were obese, leading to a median body mass index of 33.3 kg/m2. More than two-thirds of patients (69.1%) had NASH, while approximately half of the population (51.4%) had moderate steatosis, no or low-grade fibrosis (F0-2, 58.2%), and no or low-grade lobular inflammation (grade 0 or 1, 54.7%). The three countries had significantly different population profiles; for example, the United Kingdom had an approximately 10% higher prevalence of type 2 diabetes and obesity compared with the entire cohort, but a decreased arterial hypertension rate of a similar magnitude. The United Kingdom also had a significantly lower mean CLDQ score than that of the study population as a whole (4.73 vs. 4.99).

Analysis of the entire cohort revealed that a variety of demographic and disease-related factors negatively impacted health-related quality of life. Women had a significantly lower mean CLDQ score than that of men (5.31 vs. 4.62; P less than .001), more often reporting abdominal symptoms, fatigue, systemic symptoms, reduced activity, diminished emotional functioning, and worry. CLDQ overall score was negatively influenced by obesity (4.83 vs. 5.46), type 2 diabetes (4.74 vs. 5.25), and hyperlipidemia (4.84 vs. 5.24), but not hypertension. Laboratory findings that negatively correlated with CLDQ included aspartate transaminase (AST) and HbA1c, whereas ferritin was positively correlated.

Generally, patients with NASH reported worse quality of life than that of those with just NAFLD (4.85 vs. 5.31). Factors contributing most to this disparity were fatigue, systemic symptoms, activity, and worry. On a histologic level, hepatic steatosis, ballooning, and lobular inflammation predicted poorer quality of life; although advanced fibrosis and compensated cirrhosis were associated with a trend toward reduced quality of life, this pattern lacked statistical significance. Multivariate analysis, which accounted for age, sex, body mass index, country, and type 2 diabetes, revealed independent associations between reduced quality of life and type 2 diabetes, sex, age, body mass index, and hepatic inflammation, but not fibrosis.

“The striking finding of the current analysis in this well-characterized European cohort was that, in contrast to the published data on predictors of overall and liver-specific mortality, lobular inflammation correlated independently with HRQL,” the investigators wrote. “These results differ from the NASH [Clinical Research Network] cohort, which found lower HRQL using the generic [SF-36 Health Survey Questionnaire] in NASH compared with a healthy U.S. population and a significant effect in cirrhosis only.” The investigators suggested that mechanistic differences in disease progression could explain this discordance.

Although hepatic fibrosis has been tied with quality of life by some studies, the investigators pointed out that patients with chronic hepatitis B or C have reported improved quality of life after viral elimination or suppression, which reduce inflammation, but not fibrosis. “On the basis of the current analysis, it can be expected that improvement of steatohepatitis, and in particular lobular inflammation, will have measurable influence on HRQL even independently of fibrosis improvement,” the investigators concluded.

The study was funded by H2020. The investigators reported no conflicts of interest.

SOURCE: Huber Y et al. CGH. 2018 Dec 20. doi: 10.1016/j.cgh.2018.12.016.

 

A variety of demographic and disease-related factors contribute to poorer quality of life in patients with nonalcoholic fatty liver disease (NAFLD), based on questionnaires involving 304 European patients.

In contrast with previous research, lobular inflammation, but not hepatic fibrosis, was associated with worse quality of life, reported to lead author Yvonne Huber, MD, of Johannes Gutenberg University in Mainz, Germany, and colleagues. Women and those with advanced disease or comorbidities had the lowest health-related quality of life (HRQL) scores. The investigators suggested that these findings could be used for treatment planning at a population and patient level.

“With the emergence of medical therapy for [nonalcoholic steatohepatitis (NASH)], it will be of importance to identify patients with the highest unmet need for treatment,” the investigators wrote in Clinical Gastroenterology and Hepatology, emphasizing that therapies targeting inflammation could provide the greatest relief.

To determine which patients with NAFLD were most affected by their condition, the investigators used the Chronic Liver Disease Questionnaire (CLDQ), which assesses physical, mental, social, and emotional function, with lower scores indicating poorer health-related quality of life. “[The CLDQ] more specifically addresses symptoms of patients with chronic liver disease, including extrahepatic manifestations, compared with traditional HRQL measures such as the [Short Form–36 (SF-36)] Health Survey Questionnaire,” the investigators explained. Recent research has used the CLDQ to reveal a variety of findings, the investigators noted, such as a 2016 study by Alt and colleagues outlining the most common symptoms in noninfectious chronic liver disease (abdominal discomfort, fatigue, and anxiety), and two studies by Younossi and colleagues describing quality of life improvements after curing hepatitis C virus, and negative impacts of viremia and hepatic inflammation in patients with hepatitis B.

The current study involved 304 patients with histologically confirmed NAFLD who were prospectively entered into the European NAFLD registry via centers in Germany (n = 133), the United Kingdom (n = 154), and Spain (n = 17). Patient data included demographic factors, laboratory findings, and histologic features. Within 6 months of liver biopsy, patients completed the CLDQ.

The mean patient age was 52.3 years, with slightly more men than women (53.3% vs. 46.7%). Most patients (75%) were obese, leading to a median body mass index of 33.3 kg/m2. More than two-thirds of patients (69.1%) had NASH, while approximately half of the population (51.4%) had moderate steatosis, no or low-grade fibrosis (F0-2, 58.2%), and no or low-grade lobular inflammation (grade 0 or 1, 54.7%). The three countries had significantly different population profiles; for example, the United Kingdom had an approximately 10% higher prevalence of type 2 diabetes and obesity compared with the entire cohort, but a decreased arterial hypertension rate of a similar magnitude. The United Kingdom also had a significantly lower mean CLDQ score than that of the study population as a whole (4.73 vs. 4.99).

Analysis of the entire cohort revealed that a variety of demographic and disease-related factors negatively impacted health-related quality of life. Women had a significantly lower mean CLDQ score than that of men (5.31 vs. 4.62; P less than .001), more often reporting abdominal symptoms, fatigue, systemic symptoms, reduced activity, diminished emotional functioning, and worry. CLDQ overall score was negatively influenced by obesity (4.83 vs. 5.46), type 2 diabetes (4.74 vs. 5.25), and hyperlipidemia (4.84 vs. 5.24), but not hypertension. Laboratory findings that negatively correlated with CLDQ included aspartate transaminase (AST) and HbA1c, whereas ferritin was positively correlated.

Generally, patients with NASH reported worse quality of life than that of those with just NAFLD (4.85 vs. 5.31). Factors contributing most to this disparity were fatigue, systemic symptoms, activity, and worry. On a histologic level, hepatic steatosis, ballooning, and lobular inflammation predicted poorer quality of life; although advanced fibrosis and compensated cirrhosis were associated with a trend toward reduced quality of life, this pattern lacked statistical significance. Multivariate analysis, which accounted for age, sex, body mass index, country, and type 2 diabetes, revealed independent associations between reduced quality of life and type 2 diabetes, sex, age, body mass index, and hepatic inflammation, but not fibrosis.

“The striking finding of the current analysis in this well-characterized European cohort was that, in contrast to the published data on predictors of overall and liver-specific mortality, lobular inflammation correlated independently with HRQL,” the investigators wrote. “These results differ from the NASH [Clinical Research Network] cohort, which found lower HRQL using the generic [SF-36 Health Survey Questionnaire] in NASH compared with a healthy U.S. population and a significant effect in cirrhosis only.” The investigators suggested that mechanistic differences in disease progression could explain this discordance.

Although hepatic fibrosis has been tied with quality of life by some studies, the investigators pointed out that patients with chronic hepatitis B or C have reported improved quality of life after viral elimination or suppression, which reduce inflammation, but not fibrosis. “On the basis of the current analysis, it can be expected that improvement of steatohepatitis, and in particular lobular inflammation, will have measurable influence on HRQL even independently of fibrosis improvement,” the investigators concluded.

The study was funded by the European Union's Horizon 2020 research and innovation program. The investigators reported no conflicts of interest.

SOURCE: Huber Y et al. Clin Gastroenterol Hepatol. 2018 Dec 20. doi: 10.1016/j.cgh.2018.12.016.


HCC surveillance after anti-HCV therapy cost effective only for patients with cirrhosis


For patients with hepatitis C virus (HCV)–related cirrhosis (F4), but not those with advanced fibrosis (F3), hepatocellular carcinoma (HCC) surveillance after a sustained virologic response (SVR) is cost effective, according to investigators.

Current international guidelines call for HCC surveillance among all patients with advanced fibrosis (F3) or cirrhosis (F4) who have achieved SVR, but this is “very unlikely to be cost effective,” reported lead author Hooman Farhang Zangneh, MD, of Toronto General Hospital and colleagues. “HCV-related HCC rarely occurs in patients without cirrhosis,” the investigators explained in Clinical Gastroenterology and Hepatology. “With cirrhosis present, HCC incidence is 1.4% to 4.9% per year. If found early, options for curative therapy include radiofrequency ablation (RFA), surgical resection, and liver transplantation.”

The investigators developed a Markov model to determine which at-risk patients could undergo surveillance while remaining below willingness-to-pay thresholds. Specifically, cost-effectiveness was assessed for ultrasound screenings annually (every year) or biannually (twice a year) among patients with advanced fibrosis (F3) or compensated cirrhosis (F4) who were aged 50 years and had an SVR. Relevant data were drawn from expert opinions, medical literature, and Canada Life Tables. Various HCC incidence rates were tested, including a constant annual rate, rates based on type of antiviral treatment (direct-acting and interferon-based therapies), others based on stage of fibrosis, and another that increased with age. The model was validated by applying it to patients with F3 or F4 fibrosis who had not yet achieved an SVR. All monetary values were reported in 2015 Canadian dollars.
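To make the modeling concrete, a Markov cohort model simply redistributes a cohort across health states each cycle according to transition probabilities. The toy version below uses three states and the constant 0.5% annual HCC incidence of the base case described below; the mortality figures and 20-year horizon are invented for illustration, and the sketch carries none of the costs or utilities that the actual model layers on top.

```python
# Toy three-state Markov cohort model (illustration only, not the published model).
import numpy as np

states = ["SVR, no HCC", "HCC", "Dead"]
p_hcc = 0.005            # 0.5% constant annual HCC incidence (base case below)
p_die_background = 0.01  # assumed annual background mortality (illustrative)
p_die_hcc = 0.15         # assumed annual mortality with HCC (illustrative)

# Transition matrix: rows are the current state, columns the next state; rows sum to 1.
transition = np.array([
    [1 - p_hcc - p_die_background, p_hcc, p_die_background],
    [0.0, 1 - p_die_hcc, p_die_hcc],
    [0.0, 0.0, 1.0],
])

cohort = np.array([1.0, 0.0, 0.0])  # everyone starts post-SVR without HCC
for year in range(1, 21):
    cohort = cohort @ transition
    print(year, {state: round(prop, 4) for state, prop in zip(states, cohort)})
```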

Reflecting current guidelines, the investigators first tested the costs of surveillance among all patients with F3 or F4 fibrosis, assuming a constant annual HCC incidence of 0.5%. Biannual ultrasound surveillance after SVR caught more cases of HCC at a curable stage (78%) than no surveillance (29%); however, false-positives were relatively common, at 21.8% and 15.7% for biannual and annual surveillance, respectively. The investigators noted that, in the real world, some of these false-positive findings are not ruled out by more advanced imaging, so patients go on to receive unnecessary RFA, which incurs additional costs. Partly for this reason, although biannual surveillance was more effective, it was also more expensive, with an incremental cost-effectiveness ratio (ICER) of $106,792 per quality-adjusted life-year (QALY), compared with $72,105 per QALY for annual surveillance.
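The ICERs quoted throughout follow directly from the definition: incremental cost divided by incremental QALYs for one strategy relative to a comparator. The sketch below shows the arithmetic with invented per-patient totals (the article does not report the underlying cost and QALY figures), comparing each surveillance strategy with no surveillance.

```python
# Illustrative ICER arithmetic; the cost and QALY totals are invented.
def icer(cost_new: float, qaly_new: float, cost_ref: float, qaly_ref: float) -> float:
    """Incremental cost-effectiveness ratio: extra dollars per extra QALY gained."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

# Hypothetical lifetime totals per patient (2015 Canadian dollars), for illustration only.
no_surveillance = {"cost": 12_000.0, "qaly": 11.00}
annual          = {"cost": 15_600.0, "qaly": 11.05}
biannual        = {"cost": 18_400.0, "qaly": 11.06}

for name, strategy in [("annual", annual), ("biannual", biannual)]:
    value = icer(strategy["cost"], strategy["qaly"],
                 no_surveillance["cost"], no_surveillance["qaly"])
    print(f"{name} vs. no surveillance: ${value:,.0f} per QALY")
```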

Including only patients with F3 fibrosis after interferon-based therapy, using an HCC incidence of 0.23%, biannual and annual ICERs rose to $484,160 and $204,708 per QALY, respectively, both of which exceed standard willingness-to-pay thresholds. In comparison, biannual and annual ICERs were at most $55,850 and $42,305 per QALY, respectively, among patients with cirrhosis before interferon-induced SVR, using an HCC incidence rate of up to 1.39% per year.

“These results suggest that biannual (or annual) HCC surveillance is likely to be cost effective for patients with cirrhosis, but not for patients with F3 fibrosis before SVR,” the investigators wrote.

Costs for HCC surveillance among patients with cirrhosis after direct-acting antiviral-induced SVR were lower still, at $43,229 and $34,307 per QALY, far below the corresponding figures for patients with F3 fibrosis, which were $188,157 and $111,667 per QALY.

Focusing on the evident savings associated with surveillance of patients with cirrhosis, the investigators tested two diagnostic thresholds within this population with the aim of reducing costs further. They found that surveillance of patients with a pretreatment aspartate aminotransferase to platelet ratio index (APRI) greater than 2.0 (HCC incidence, 0.89%) was associated with biannual and annual ICERs of $48,729 and $37,806 per QALY, respectively, but when APRI was less than 2.0 (HCC incidence, 0.093%), surveillance was less effective and more expensive than no surveillance at all. A similar trend was found for an FIB-4 threshold of 3.25.
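The APRI and FIB-4 cutoffs used to stratify the cirrhosis cohort are simple laboratory-based scores. The formulas below are the standard published definitions rather than anything specific to this study, and the example laboratory values are invented.

```python
import math

def apri(ast_u_l: float, ast_uln_u_l: float, platelets_10e9_l: float) -> float:
    """AST-to-platelet ratio index; the study used a pretreatment threshold of 2.0."""
    return (ast_u_l / ast_uln_u_l) / platelets_10e9_l * 100

def fib4(age_years: float, ast_u_l: float, alt_u_l: float, platelets_10e9_l: float) -> float:
    """FIB-4 index; the study used a threshold of 3.25."""
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

# Invented pretreatment labs for a hypothetical 55-year-old patient.
print(round(apri(ast_u_l=120, ast_uln_u_l=40, platelets_10e9_l=90), 2))              # 3.33 -> above 2.0
print(round(fib4(age_years=55, ast_u_l=120, alt_u_l=100, platelets_10e9_l=90), 2))   # 7.33 -> above 3.25
```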

Employment of age-stratified risk of HCC also reduced costs of screening for patients with cirrhosis. With this strategy, ICER was $48,432 per QALY for biannual surveillance and $37,201 per QALY for annual surveillance.

“These data suggest that, if we assume HCC incidence increases with age, biannual or annual surveillance will be cost effective for the vast majority, if not all, patients with cirrhosis before SVR,” the investigators wrote.

“Our analysis suggests that HCC surveillance is very unlikely to be cost effective in patients with F3 fibrosis, whereas both annual and biannual modalities are likely to be cost effective at standard willingness-to-pay thresholds for patients with cirrhosis compared with no surveillance,” the investigators wrote.

“Additional long-term follow-up data are required to help identify patients at highest risk of HCC after SVR to tailor surveillance guidelines,” the investigators concluded.

The study was funded by the Toronto Centre for Liver Disease. The investigators declared no conflicts of interest.

This story was updated on 7/12/2019.

SOURCE: Zangneh et al. Clin Gastroenterol Hepatol. 2018 Dec 20. doi: 10.1016/j.cgh.2018.12.018.


Genomic study reveals five subtypes of colorectal cancer


Colorectal cancer can be divided into five DNA methylation subtypes that predict molecular and clinical behavior and may offer future therapeutic targets, according to investigators.

In 216 unselected colorectal cancers, five subtypes of the CpG island methylator phenotype (CIMP) showed “striking” associations with sex, age, and tumor location, reported lead author Lochlan Fennell, MD, of the QIMR Berghofer Medical Research Institute in Queensland, Australia, and colleagues. CIMP level increased with age in a stepwise fashion, they noted.

Further associations with CIMP subtype and BRAF mutation status support the investigators’ recent report that sessile serrated adenomas are rare in young patients and pose little risk of malignancy. With additional research, these findings could “inform the development of patient-centric surveillance for young and older patients who present with sessile serrated adenomas,” the investigators wrote in Cellular and Molecular Gastroenterology and Hepatology.

“CIMP can be detected using a standardized marker panel to stratify tumors as CIMP-high, CIMP-low, or CIMP-negative,” the investigators noted. In the present study, they expanded these three existing subtypes into five, allowing for better prediction of the clinical and molecular characteristics associated with disease progression.

Initial genomic testing showed that 13.4% of cases carried a BRAF V600E mutation, 34.7% were mutated at KRAS codon 12 or 13, and 42.2% had a TP53 mutation. Sorted into the three previously described subtypes, CIMP-negative was most common (68.5%), followed by CIMP-low (20.4%) and CIMP-high (11.1%). About two-thirds (66%) of BRAF-mutant cancers were CIMP-high, compared with just 3% of BRAF wild-type cases (P less than .0001). KRAS-mutated cases were more often CIMP-low than KRAS wild-type cancers (34.6% vs. 12.8%; P less than .001).

With use of Illumina HumanMethylation450 BeadChip arrays and recursively partitioned mixture model clustering, five methylation clusters were identified: CIMP-H1 and CIMP-H2 (high methylation levels), CIMP-L1 and CIMP-L2 (intermediate methylation levels), and CIMP-negative (low methylation level). As described above, methylation level demonstrated a direct relationship with age, ranging from CIMP-negative (61.9 years) to CIMP-H1 (75.2 years). The investigators also reported unique characteristics of each new subtype. For instance, the CIMP-H1 cluster had many features in common with cases of serrated neoplasia, such as BRAF mutation positivity (73.9%; P less than .0001).
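Conceptually, the subtyping groups tumors by their genome-wide methylation (beta-value) profiles. The sketch below clusters simulated beta values with ordinary k-means purely to illustrate that idea; it is a stand-in for, not a reproduction of, the recursively partitioned mixture model the authors applied to HumanMethylation450 data, and the probe count and group structure are invented.

```python
# Toy illustration of methylation-based subtyping: k-means on simulated beta values.
# This is a stand-in for, not a reproduction of, the authors' recursively
# partitioned mixture-model clustering of HumanMethylation450 array data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
n_per_group, n_probes = 43, 500  # ~216 tumors as in the study; probe count is arbitrary

# Simulate five tumor groups with increasing mean methylation (beta values in [0, 1]).
betas = np.vstack([
    np.clip(rng.normal(mean, 0.08, size=(n_per_group, n_probes)), 0, 1)
    for mean in (0.15, 0.30, 0.45, 0.60, 0.75)
])

labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(betas)

# Report clusters from least to most methylated (CIMP-negative through CIMP-H1 in the paper).
for k in sorted(range(5), key=lambda k: betas[labels == k].mean()):
    print(f"cluster {k}: mean beta = {betas[labels == k].mean():.2f}, n = {(labels == k).sum()}")
```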

“BRAF mutations are a hallmark of the serrated neoplasia pathway, and indicate that these cancers probably arose in serrated precursor lesions,” the investigators wrote. “We previously showed that the colonoscopic incidence of sessile serrated adenomas does not differ between patients aged in their 30s and patients who are much older, whereas BRAF mutant cancers were restricted to older individuals, suggesting these BRAF mutant polyps may have limited malignant potential in young patients.”

In contrast with the CIMP-H1 cases, CIMP-H2 cancers were more often KRAS mutant (54.5% vs. 17.4%). Other findings revealed associations with subtype and location; for example, CIMP-L1 cases were located equally in the distal and proximal colon, whereas CIMP-L2 cases more often localized to the distal colon and rectum. Of note for CIMP-negative cancers, most (62.3%) occurred in the distal colon, and none had a BRAF mutation.

The five methylation subtypes also showed associations with consensus molecular subtypes (CMS) to varying degrees. The two strongest correlations were found in CIMP-H1 cancers and CIMP-H2 cancers, which were most frequently classified as CMS1 (69.6%) and CMS3 (54.5%), respectively.

Using CIBERSORT, the investigators detected a variety of associations between the five subtypes and stromal immune cell composition. For example, CIMP-H1 cases were enriched for macrophages, compared with the other subtypes, except CIMP-L2. Mast cells showed a stepwise relationship with subtype; they contributed the most to the immune microenvironment of CIMP-negative cancers and the least to cases classified as CIMP-H1. A converse trend was found with natural killer cells.

Of note, in CIMP-H1 and CIMP-H2 cancers, oncogenes were significantly more likely than tumor-suppressor genes to undergo gene body methylation, which is positively correlated with gene expression, and oncogenes in these subtypes had significantly greater gene body methylation than normal colonic mucosa.

“The five subtypes identified in this study are highly correlated with key clinical and molecular features, including patient age, tumor location, microsatellite instability, and oncogenic mitogen-activated protein kinase mutations,” they wrote. “We show that cancers with high DNA methylation show an increased preponderance for mutating genes involved in epigenetic regulation, and namely those that are implicated in the chromatin remodeling process.”

Concluding, the investigators explained the role of their research in future therapy development. “Our analyses have identified potentially druggable vulnerabilities in cancers of different methylation subtypes,” they wrote. “Inhibitors targeting synthetic lethalities, such as SWI/SNF component inhibitors for those with ARID mutations, should be evaluated because these agents may be clinically beneficial to certain patient subsets.”

The study was funded by the National Health and Medical Research Council, the US National Institutes of Health, Pathology Queensland, and others. The investigators disclosed no conflicts of interest.

SOURCE: Fennell L et al. Cell Mol Gastroenterol Hepatol. 2019 Apr 4. doi: 10.1016/j.jcmgh.2019.04.002.

Stage now set for functional studies

Genomic, epigenomic, and transcriptomic information has revealed molecular subclasses of colorectal cancer (CRC), which have refined our understanding of its molecular and cellular biology and improved the treatment of patients with CRC. Several reliable and clinically useful molecular subtypes have been identified, including the microsatellite instability (MSI), chromosomal instability (CIN), CpG island methylator phenotype (CIMP), and CMS 1-4 subtypes. Despite these substantial advances, it is also clear that we still only partially grasp the molecular and cellular biology driving CRC.

Dr. William M. Grady

The studies by Fennell et al. provide new insights into the CIMP subtype of CRC that address this knowledge gap. Using a large CRC cohort and more detailed molecular information than was available in prior studies, they have identified previously unrecognized CRC CIMP subtypes that have unique methylomes and mutation patterns. These five CIMP subclasses vary with regard to location in the colon; frequency of MSI and of mutations in KRAS and BRAF; and alterations in epigenetic regulatory genes. The observations related to differences in frequencies of MSI and of KRAS and BRAF mutations help demystify the heterogeneity in clinical and cellular behavior that has been seen in the broader class of CIMP cancers. Perhaps most importantly, their studies identify plausible driver molecular alterations unique to the CIMP subclasses, such as subclass-specific mutations in epigenetic regulatory genes and activated oncogenes. These are promising novel targets for chemoprevention strategies and therapies. Fennell and colleagues have now set the stage for functional studies of these molecular alterations to determine their true role in the cellular and clinical behavior of CRC.

William M. Grady, MD, is the Rodger C. Haggitt Professor of Medicine, department of medicine, division of gastroenterology, University of Washington School of Medicine, and clinical research division, Fred Hutchinson Cancer Research Center, Seattle. He is an advisory board member for Freenome and SEngine; has consulted for DiaCarta, Boehringer Ingelheim, and Guardant Health; and has conducted industry-sponsored research for Janssen and Cambridge Epigenetix.
 


Underwater endoscopic mucosal resection may be an option for colorectal lesions


For intermediate-size colorectal lesions, underwater endoscopic mucosal resection (UEMR) may offer cleaner margins than conventional EMR without increasing procedure time or risk of adverse events, based on a recent head-to-head trial conducted in Japan.

UEMR was associated with higher R0 and en bloc resection rates than was conventional EMR (CEMR) when used for intermediate-size colorectal lesions, reported lead author Takeshi Yamashina, MD, of Osaka (Japan) International Cancer Institute, and colleagues. The study was the first multicenter, randomized trial to demonstrate the superiority of UEMR over CEMR, they noted.

Although CEMR is a well-established method of removing sessile colorectal lesions, those larger than 10 mm can be difficult to resect en bloc, which contributes to a local recurrence rate exceeding 15% when alternative, piecemeal resection is performed, the investigators explained in Gastroenterology.

Recently, UEMR has emerged as “an alternative to CEMR and is reported to be effective for removing flat or large colorectal polyps,” the investigators wrote. “With UEMR, the bowel lumen is filled with water instead of air/CO2, and the lesion is captured and resected with a snare without submucosal injection of normal saline.”

To find out if UEMR offers better results than CEMR, the investigators recruited 211 patients with 214 colorectal lesions at five centers in Japan. Patients were aged at least 20 years and had mucosal lesions of 10-20 mm in diameter. Based on macroscopic appearance, pit pattern classification with magnifying chromoendoscopy, or narrow-band imaging, lesions were classified as adenoma, sessile serrated adenoma/polyp, or intramucosal adenocarcinoma. Patients were randomly assigned in a 1:1 ratio to the UEMR or CEMR group, and just prior to the procedure, operators were informed of the allocated treatment. Ten expert operators were involved, each with at least 10 years of experience, in addition to 18 nonexpert operators with less than 10 years of experience. The primary endpoint was the difference in R0 resection rate between the two groups, with R0 defined as en bloc resection with histologically negative margins. Secondary endpoints were en bloc resection rate, adverse events, and procedure time.

The results showed a clear win for UEMR, with an R0 rate of 69%, compared with 50% for CEMR (P = .011), and an en bloc resection rate that followed the same trend (89% vs. 75%; P = .007). Neither median procedure times nor number of adverse events were significantly different between groups.
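The primary comparison above is a difference in proportions between two randomized arms. A minimal sketch of such a comparison is shown below; the per-arm lesion counts are assumed to be roughly equal halves of the 214 lesions, so the resulting P value is illustrative rather than a reproduction of the published .011.

```python
# Illustrative two-proportion comparison for the R0 resection rates.
# Arm sizes are assumptions (~equal halves of the 214 lesions), so the
# P value printed here is illustrative, not the trial's reported value.
from statsmodels.stats.proportion import proportions_ztest

n_uemr, n_cemr = 107, 107                                  # assumed arm sizes
r0_counts = [round(0.69 * n_uemr), round(0.50 * n_cemr)]   # ~74 vs. ~54 R0 resections

z_stat, p_value = proportions_ztest(count=r0_counts, nobs=[n_uemr, n_cemr])
print(f"UEMR {r0_counts[0]}/{n_uemr} vs. CEMR {r0_counts[1]}/{n_cemr}: "
      f"z = {z_stat:.2f}, P = {p_value:.3f}")
```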

Subset analysis showed that UEMR was best suited for lesions at least 15 mm in diameter, although the investigators pointed out the superior R0 resection rate with UEMR held steady regardless of lesion morphology, size, location, or operator experience level.

The investigators suggested that the findings give reason to amend some existing recommendations. “Although the European Society of Gastrointestinal Endoscopy Clinical Guidelines suggest hot-snare polypectomy with submucosal injection for removing sessile polyps 10-19 mm in size, we found that UEMR was more effective than CEMR, in terms of better R0 and en bloc resection rates,” they wrote. “Hence, we think that UEMR will become an alternative to CEMR. It could fill the gap for removing polyps 9 mm [or larger] (indication for removal by cold-snare polypectomy) and [smaller than] 20 mm (indication for ESD removal).”

During the discussion, the investigators explained that UEMR achieves better outcomes primarily by improving access to lesions. Water immersion causes lesions to float upright into the lumen, while keeping the muscularis propria circular behind the submucosa, which allows for easier snaring and decreases risk of perforation. Furthermore, the investigators noted, water immersion limits flexure angulation, luminal distension, and loop formation, all of which improve maneuverability and visibility.

Still, UEMR may take some operator adjustment, the investigators added, going on to provide some pointers. “In practice, we think it is important to fill the entire lumen only with fluid, so we always deflate the lumen completely and then fill it with fluid,” they wrote. “[When the lumen is filled], it is not necessary to change the patient’s position during the UEMR procedure.”

“Also, in cases with unclear endoscopic vision, endoscopists are familiar with air insufflation but, during UEMR, it is better to infuse the fluid to expand the lumen and maintain a good endoscopic view. Therefore, for the beginner, we recommend that the air insufflation button of the endoscopy machine be switched off.”

Additional tips included using saline instead of distilled water, and employing thin, soft snares.

The investigators reported no external funding or conflicts of interest.

SOURCE: Yamashina T et al. Gastroenterology. 2019 Apr 11. doi: 10.1053/j.gastro.2019.04.005.

Publications
Topics
Sections

 

For intermediate-size colorectal lesions, underwater endoscopic mucosal resection (UEMR) may offer cleaner margins than conventional EMR without increasing procedure time or risk of adverse events, based on a recent head-to-head trial conducted in Japan.

UEMR was associated with higher R0 and en bloc resection rates than was conventional EMR (CEMR) when used for intermediate-size colorectal lesions, reported lead author Takeshi Yamashina, MD, of Osaka (Japan) International Cancer Institute, and colleagues. The study was the first multicenter, randomized trial to demonstrate the superiority of UEMR over CEMR, they noted.

Although CEMR is a well-established method of removing sessile colorectal lesions, those larger than 10 mm can be difficult to resect en bloc, which contributes to a local recurrence rate exceeding 15% when alternative, piecemeal resection is performed, the investigators explained in Gastroenterology

Recently, UEMR has emerged as “an alternative to CEMR and is reported to be effective for removing flat or large colorectal polyps,” the investigators wrote. “With UEMR, the bowel lumen is filled with water instead of air/CO2, and the lesion is captured and resected with a snare without submucosal injection of normal saline.”

To find out if UEMR offers better results than CEMR, the investigators recruited 211 patients with 214 colorectal lesions at five centers in Japan. Patients were aged at least 20 years and had mucosal lesions of 10-20 mm in diameter. Based on macroscopic appearance, pit pattern classification with magnifying chromoendoscopy, or narrow-band imaging, lesions were classified as adenoma, sessile serrated adenoma/polyp, or intramucosal adenocarcinoma. Patients were randomly assigned in a 1:1 ratio to the UEMR or CEMR group, and just prior to the procedure, operators were informed of the allocated treatment. Ten expert operators were involved, each with at least 10 years of experience, in addition to 18 nonexpert operators with less than 10 years of experience. The primary endpoint was the difference in R0 resection rate between the two groups, with R0 defined as en bloc resection with histologically negative margins. Secondary endpoints were en bloc resection rate, adverse events, and procedure time.

The results showed a clear win for UEMR, with an R0 rate of 69%, compared with 50% for CEMR (P = .011), and an en bloc resection rate that followed the same trend (89% vs. 75%; P = .007). Neither median procedure times nor number of adverse events were significantly different between groups.

Subset analysis showed that UEMR was best suited for lesions at least 15 mm in diameter, although the investigators pointed out the superior R0 resection rate with UEMR held steady regardless of lesion morphology, size, location, or operator experience level.

The investigators suggested that the findings give reason to amend some existing recommendations. “Although the European Society of Gastrointestinal Endoscopy Clinical Guidelines suggest hot-snare polypectomy with submucosal injection for removing sessile polyps 10-19 mm in size, we found that UEMR was more effective than CEMR, in terms of better R0 and en bloc resection rates,” they wrote. “Hence, we think that UEMR will become an alternative to CEMR. It could fill the gap for removing polyps 9 mm [or larger] (indication for removal by cold-snare polypectomy) and [smaller than] 20 mm (indication for ESD removal).”

During the discussion, the investigators explained that UEMR achieves better outcomes primarily by improving access to lesions. Water immersion causes lesions to float upright into the lumen, while keeping the muscularis propria circular behind the submucosa, which allows for easier snaring and decreases risk of perforation. Furthermore, the investigators noted, water immersion limits flexure angulation, luminal distension, and loop formation, all of which improve maneuverability and visibility.

Still, UEMR may take some operator adjustment, the investigators added, going on to provide some pointers. “In practice, we think it is important to fill the entire lumen only with fluid, so we always deflate the lumen completely and then fill it with fluid,” they wrote. “[When the lumen is filled], it is not necessary to change the patient’s position during the UEMR procedure.”

“Also, in cases with unclear endoscopic vision, endoscopists are familiar with air insufflation but, during UEMR, it is better to infuse the fluid to expand the lumen and maintain a good endoscopic view. Therefore, for the beginner, we recommend that the air insufflation button of the endoscopy machine be switched off.”

Additional tips included using saline instead of distilled water, and employing thin, soft snares.

The investigators reported no external funding or conflicts of interest.

SOURCE: Yamashina T et al. Gastro. 2018 Apr 11. doi: 10.1053/j.gastro.2019.04.005.

 

For intermediate-size colorectal lesions, underwater endoscopic mucosal resection (UEMR) may offer cleaner margins than conventional EMR without increasing procedure time or risk of adverse events, based on a recent head-to-head trial conducted in Japan.

UEMR was associated with higher R0 and en bloc resection rates than was conventional EMR (CEMR) when used for intermediate-size colorectal lesions, reported lead author Takeshi Yamashina, MD, of Osaka (Japan) International Cancer Institute, and colleagues. The study was the first multicenter, randomized trial to demonstrate the superiority of UEMR over CEMR, they noted.

Although CEMR is a well-established method of removing sessile colorectal lesions, those larger than 10 mm can be difficult to resect en bloc, which contributes to a local recurrence rate exceeding 15% when alternative, piecemeal resection is performed, the investigators explained in Gastroenterology

Recently, UEMR has emerged as “an alternative to CEMR and is reported to be effective for removing flat or large colorectal polyps,” the investigators wrote. “With UEMR, the bowel lumen is filled with water instead of air/CO2, and the lesion is captured and resected with a snare without submucosal injection of normal saline.”

To find out if UEMR offers better results than CEMR, the investigators recruited 211 patients with 214 colorectal lesions at five centers in Japan. Patients were aged at least 20 years and had mucosal lesions of 10-20 mm in diameter. Based on macroscopic appearance, pit pattern classification with magnifying chromoendoscopy, or narrow-band imaging, lesions were classified as adenoma, sessile serrated adenoma/polyp, or intramucosal adenocarcinoma. Patients were randomly assigned in a 1:1 ratio to the UEMR or CEMR group, and just prior to the procedure, operators were informed of the allocated treatment. Ten expert operators were involved, each with at least 10 years of experience, in addition to 18 nonexpert operators with less than 10 years of experience. The primary endpoint was the difference in R0 resection rate between the two groups, with R0 defined as en bloc resection with histologically negative margins. Secondary endpoints were en bloc resection rate, adverse events, and procedure time.

The results showed a clear win for UEMR, with an R0 rate of 69%, compared with 50% for CEMR (P = .011), and an en bloc resection rate that followed the same trend (89% vs. 75%; P = .007). Neither median procedure times nor number of adverse events were significantly different between groups.

Subset analysis showed that UEMR was best suited for lesions at least 15 mm in diameter, although the investigators pointed out the superior R0 resection rate with UEMR held steady regardless of lesion morphology, size, location, or operator experience level.

The investigators suggested that the findings give reason to amend some existing recommendations. “Although the European Society of Gastrointestinal Endoscopy Clinical Guidelines suggest hot-snare polypectomy with submucosal injection for removing sessile polyps 10-19 mm in size, we found that UEMR was more effective than CEMR, in terms of better R0 and en bloc resection rates,” they wrote. “Hence, we think that UEMR will become an alternative to CEMR. It could fill the gap for removing polyps 9 mm [or larger] (indication for removal by cold-snare polypectomy) and [smaller than] 20 mm (indication for ESD removal).”

During the discussion, the investigators explained that UEMR achieves better outcomes primarily by improving access to lesions. Water immersion causes lesions to float upright into the lumen, while keeping the muscularis propria circular behind the submucosa, which allows for easier snaring and decreases risk of perforation. Furthermore, the investigators noted, water immersion limits flexure angulation, luminal distension, and loop formation, all of which improve maneuverability and visibility.

Still, UEMR may take some operator adjustment, the investigators added, going on to provide some pointers. “In practice, we think it is important to fill the entire lumen only with fluid, so we always deflate the lumen completely and then fill it with fluid,” they wrote. “[When the lumen is filled], it is not necessary to change the patient’s position during the UEMR procedure.”

“Also, in cases with unclear endoscopic vision, endoscopists are familiar with air insufflation but, during UEMR, it is better to infuse the fluid to expand the lumen and maintain a good endoscopic view. Therefore, for the beginner, we recommend that the air insufflation button of the endoscopy machine be switched off.”

Additional tips included using saline instead of distilled water, and employing thin, soft snares.

The investigators reported no external funding or conflicts of interest.

SOURCE: Yamashina T et al. Gastroenterology. 2019 Apr 11. doi: 10.1053/j.gastro.2019.04.005.

AGA Clinical Practice Update: Coagulation in cirrhosis

Cirrhosis can involve “precarious” changes in hemostatic pathways that tip the scales toward either bleeding or hypercoagulation, experts wrote in an American Gastroenterological Association Clinical Practice Update.

Based on current evidence, clinicians should not routinely correct thrombocytopenia and coagulopathy in patients with cirrhosis prior to low-risk procedures, such as therapeutic paracentesis, thoracentesis, and routine upper endoscopy for variceal ligation, Jacqueline G. O’Leary, MD, of Dallas VA Medical Center and her three coreviewers wrote in Gastroenterology.

To optimize clot formation prior to high-risk procedures, and in patients with active bleeding, a platelet count above 50,000 per mcL is still recommended. However, it may be more meaningful to couple that platelet target with a fibrinogen level above 120 mg/dL rather than rely on the international normalized ratio (INR), the experts wrote. Not only does INR vary significantly depending on which thromboplastin is used in the test, but “correcting” INR with a fresh frozen plasma infusion does not affect thrombin production and worsens portal hypertension. Using cryoprecipitate to replenish fibrinogen has less impact on portal hypertension. “Global tests of clot formation, such as rotational thromboelastometry (ROTEM), thromboelastography (TEG), sonorheometry, and thrombin generation may eventually have a role in the evaluation of clotting in patients with cirrhosis but currently lack validated target levels,” the experts wrote.

They advised clinicians to limit the use of blood products (such as fresh frozen plasma and pooled platelet transfusions) because of cost and the risk of exacerbated portal hypertension, infection, and immunologic complications. For severe anemia and uremia, red blood cell transfusion (250 mL) can be considered. Platelet-rich plasma from one donor is less immunologically risky than a pooled platelet transfusion. Thrombopoietin agonists are “a good alternative” to platelet transfusion but require about 10 days for response. Alternative prothrombotic therapies include oral thrombopoietin receptor agonists (avatrombopag and lusutrombopag) to boost platelet count before an invasive procedure, and antifibrinolytic therapy (aminocaproic acid and tranexamic acid) for persistent bleeding from mucosal oozing or puncture wounds. Desmopressin should only be considered for patients with comorbid renal failure.

For anticoagulation, the practice update recommends considering systemic heparin infusion for cirrhotic patients with symptomatic deep venous thrombosis (DVT) or portal vein thrombosis (PVT). However, the anti–factor Xa assay will not reliably monitor response if patients have low liver-derived antithrombin III (heparin cofactor). “With newly diagnosed PVT, the decision to intervene with directed therapy rests on the extent of the thrombosis, presence or absence of attributable symptoms, and the risk of bleeding and falls,” the experts stated.

Six-month follow-up imaging is recommended to assess anticoagulation efficacy. More frequent imaging can be considered for patients with PVT deemed at high risk for therapeutic anticoagulation. If clots do not fully resolve after 6 months of treatment, options include extending therapy with the same agent, switching to a different anticoagulant class, or receiving a transjugular intrahepatic portosystemic shunt (TIPS). “The role for TIPS in PVT is evolving and may address complications like portal hypertensive bleeding, medically refractory clot, and the need for repeated banding after variceal bleeding,” the experts noted.

Prophylaxis of DVT is recommended for all hospitalized patients with cirrhosis. Vitamin K antagonists and direct-acting oral anticoagulants (dabigatran, apixaban, rivaroxaban, and edoxaban) are alternatives to heparin for anticoagulation of cirrhotic patients with either PVT or DVT, the experts wrote. However, DOACs are not recommended for most Child-Pugh B patients or for any Child-Pugh C patients.

No funding sources or conflicts of interest were reported.

SOURCE: O’Leary JG et al. Gastroenterology. 2019. doi: 10.1053/j.gastro.2019.03.070.

Atypical food allergies common in IBS

Among patients with irritable bowel syndrome (IBS) who tested negative for classic food allergies, confocal laser endomicroscopy showed that 70% had an immediate disruption of the intestinal barrier in response to at least one food challenge, with accompanying changes in epithelial tight junction proteins and eosinophils.

Among 108 patients who completed the study, 61% showed this atypical allergic response to wheat, wrote Annette Fritscher-Ravens, MD, PhD, of University Hospital Schleswig-Holstein in Kiel, Germany, and her associates. Strikingly, almost 70% of patients with atypical food allergies to wheat, yeast, milk, soy, or egg white who eliminated these foods from their diets showed at least an 80% improvement in IBS symptoms after 3 months. These findings were published in Gastroenterology.

Confocal laser endomicroscopy (CLE) “permits real-time detection and quantification of changes in intestinal tissues and cells, including increases in intraepithelial lymphocytes and fluid extravasation through epithelial leaks,” the investigators wrote. This approach helps clinicians objectively detect and measure gastrointestinal pathology in response to specific foods, potentially freeing IBS patients from highly restrictive diets that ease symptoms but are hard to follow, and are not meant for long-term use.

For the study, the researchers enrolled patients meeting Rome III IBS criteria who tested negative for common food antigens on immunoglobulin E serology and skin tests. During endoscopy, each patient underwent sequential duodenal challenges with 20-mL suspensions of wheat, yeast, milk, soy, and egg white, followed by CLE with biopsy.

Among 108 patients who finished the study, 76 (70%) were CLE positive. They and their first-degree relatives were significantly more likely to have atopic disorders than were CLE-negative patients (P = .001). The most common allergen was wheat (61% of patients), followed by yeast (20%), milk (9%), soy (7%), and egg white (4%). Also, nine patients reacted to two of the tested food antigens.

Compared with CLE-negative patients or controls, CLE-positive patients also had significantly more intraepithelial lymphocytes (P = .001) and postchallenge expression of claudin-2 (P = .023), which contributes to tight junction permeability and is known to be upregulated in intestinal barrier dysfunction, IBS, and inflammatory bowel disease. Conversely, levels of the tight junction protein occludin were significantly lower in duodenal biopsies from CLE-positive patients versus controls (P = .022). “Levels of mRNAs encoding inflammatory cytokines were unchanged in duodenal tissues after CLE challenge, but eosinophil degranulation increased,” the researchers wrote.

In a double-blind, randomized, crossover study, patients then excluded from their diet the antigen to which they had tested positive or consumed a sham (placebo) diet that excluded only some foods containing the antigen, with a 2-week washout period in between. The CLE-positive patients showed a 70% average improvement in Francis IBS severity score after 3 months of the intervention diet and a 76% improvement at 6 months. Strikingly, 68% of CLE-positive patients showed at least an 80% improvement in symptoms, while only 4% did not respond at all.

“Since we do not observe a histological mast cell/basophil increase or activation, and [we] do not find increased mast cell mediators (tryptase) in the duodenal fluid after positive challenge, we assume a nonclassical or atypical food allergy as cause of the mucosal reaction observed by CLE,” the researchers wrote. Other immune cell parameters remained unchanged, but additional studies are needed to see if these changes are truly absent or occur later after challenge. The researchers are conducting murine studies of eosinophilic food allergy to shed more light on these nonclassical food allergies.

Funders included the Rashid Hussein Charity Trust, the German Research Foundation, and the Leibniz Foundation. The researchers reported having no conflicts of interest.

SOURCE: Fritscher-Ravens A et al. Gastroenterology. 2019 May 14. doi: 10.1053/j.gastro.2019.03.046.

Algorithm predicts villous atrophy in children with potential celiac disease

Evidence-based prediction with a grain of salt

A new algorithm may be able to predict which children with potential celiac disease will go on to develop villous atrophy, according to investigators writing in Gastroenterology.

The risk model was developed from the largest cohort of its kind, with the longest follow-up to date, reported lead author Renata Auricchio, MD, PhD, of University Federico II in Naples, Italy, and colleagues. The algorithm relies most heavily on the baseline number of intraepithelial lymphocytes (IELs) in the mucosa, followed by age at diagnosis and genetic profile. Because more than half of potential cases did not develop flat mucosa within 12 years, clinicians may now consider prescribing gluten-free diets only to the highest-risk patients rather than to all suspected cases.

Development of the algorithm began with enrollment of 340 children aged 2-18 years who were positive for immunoglobulin A endomysial antibodies and had tested positive twice consecutively for antitissue transglutaminase antibodies. Additionally, children were required to carry HLA DQ2- or DQ8-positive haplotypes and have normal duodenal architecture in five biopsy samples. Because of symptoms suggestive of celiac disease or parental discretion, 60 patients were started on a gluten-free diet and excluded from the study, leaving 280 patients in the final cohort. These patients were kept on a gluten-containing diet and followed for up to 12 years. Every 6 months, the investigators checked antibodies and clinical status, and every 2 years small bowel biopsy was performed, if symptoms had not necessitated it earlier.

After a median follow-up of 60 months, ranging from 18 months to 12 years, 39 patients (13.9%) developed symptoms of celiac disease and were placed on a gluten-free diet, although they declined confirmatory biopsy, precluding a formal diagnosis of celiac disease. Another 33 patients (11.7%) were lost to follow-up, and 89 (32%) stopped producing antibodies, with none going on to develop villous atrophy. In total, 42 patients (15%) developed flat mucosa during the follow-up period, with an estimated cumulative incidence of 43% at 12 years. The investigators noted that patients most frequently progressed within two time frames: at 24-48 months after enrollment or at 96-120 months.

To develop the algorithm, the investigators performed multivariable analysis with several potential risk factors, including age, sex, genetic profile, mucosal characteristics, and concomitant autoimmune diseases. Of these, a high number of IELs on first biopsy was most highly correlated with progression to celiac disease. Patients who developed villous atrophy had a mean of 11.9 IELs at first biopsy, compared with 6.44 among those who remained potential (P = .05). The next strongest predictive factors were age and genetic profile. Just 7% of children younger than 3 years developed flat mucosa, compared with 51% of patients aged 3-10 years and 55% of those older than 10 years (P = .007). HLA status was predictive in the group aged 3-10 years but not significant in the youngest or oldest patients. Therefore, HLA haplotype was included in the final algorithm, but with a smaller contribution than five non-HLA genes, namely IL12a, SH2B3, RGS1, CCR, and IL2/IL21.

“Combining these risk factors, we set up a model to predict the probability for a patient to evolve from potential celiac disease to villous atrophy,” the investigators wrote. “Overall, the discriminant analysis model allows us to correctly classify, at entry, 80% of the children who will not develop a flat mucosa over follow-up, while approximately 69% of those who will develop flat mucosa are correctly classified by the parameters we analyzed. This system is then more accurate to predict a child who will not develop flat mucosa and then can be monitored on a gluten-containing diet than a child who will become celiac.”
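
To make the structure of such a model concrete, the sketch below is a purely illustrative example, not the authors' published algorithm: it fits scikit-learn's linear discriminant analysis to simulated data using the three strongest predictors reported (baseline IEL count, age at diagnosis, and a summary genetic risk score). All data, coefficients, and the composite genetic score are invented for demonstration.

```python
# Illustrative only: a linear discriminant classifier of the general kind described,
# fit to SIMULATED data. Nothing here reproduces the published model or cohort.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n = 280  # size of the final study cohort; the rows below are simulated

# Hypothetical predictors: baseline IEL count, age at diagnosis (years),
# and a 0-1 genetic risk score standing in for HLA plus non-HLA genes.
iel = rng.normal(8, 3, n).clip(0)
age = rng.uniform(2, 18, n)
genetic_risk = rng.uniform(0, 1, n)
X = np.column_stack([iel, age, genetic_risk])

# Simulated outcome (progression to villous atrophy), loosely tied to the predictors
logit = 0.3 * (iel - 8) + 0.1 * (age - 6) + 1.0 * (genetic_risk - 0.5) - 1.5
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LinearDiscriminantAnalysis().fit(X, y)

# Estimated probability of progression for a hypothetical new patient
# (12 IELs at baseline, 8 years old, high genetic risk score)
new_patient = np.array([[12.0, 8.0, 0.9]])
print(model.predict_proba(new_patient)[0, 1])
```

As in the study, a classifier of this kind is judged by how often it correctly labels the children who will, or will not, go on to develop flat mucosa, rather than by a single overall accuracy figure.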

The investigators noted that the IEL count is not a commonly performed diagnostic test; however, they recommended it, even if it necessitates referral. “The [IEL] count turned out to be crucial for the prediction power of the discriminant analysis,” the investigators wrote.

“The long-term risks of potential celiac disease have never been accurately evaluated. Thus, before adopting a wait-and-see strategy on a gluten-containing diet, a final decision should always be shared with the family.”

Still, the investigators concluded that gluten-free diet “should not be prescribed indistinctly to all patients” with potential celiac disease, as it is a “very heterogenic condition and is not necessarily the first step of overt disease.”

The investigators disclosed no funding or conflicts of interest.

SOURCE: Auricchio R et al. Gastroenterology. 2019 Apr 9. doi: 10.1053/j.gastro.2019.04.004.

While the simplification of the diagnostic process for celiac disease (CD), now heavily reliant on CD-specific autoantibodies, has made the life of clinicians easier in many respects, new scenarios also have emerged that are posing new challenges. One of them is that a substantial, growing portion of subjects (who may or may not have symptoms) present with positive CD autoantibodies but a normal duodenal mucosa (“potential celiac patient”). If left on gluten, with time some will develop villous atrophy, but some won’t. What is the clinician supposed to do with them?

The paper by Auricchio et al. addresses this issue in a rigorous, well-structured way by closely and prospectively monitoring a large series of pediatric patients. Their conclusions have very useful implications for the clinician. In fact, taking into consideration several criteria that they found valuable after a long observation period – such as age of the child, HLA status, persistence of elevated CD-specific autoantibodies, and presence or absence of intraepithelial lymphocytes in the initial biopsy – they concluded that one can correctly identify at the outset four out of five potential celiac patients who will not develop villous atrophy and thus do not need to follow a gluten-free diet.

Ultimately, however, let’s not forget that we are still dealing with percentages of risk to develop full-blown CD, not with definitive certainties. Hence, the decision of starting a gluten-free diet or not (and of how often and in which way to monitor those who remain on gluten) remains a mutually agreed upon plan sealed by two actors: on one side the patient (or the patient’s family); and on the other, an experienced health care provider who has clearly explained the facts. In other words, evidence-based criteria, good old medicine, and a grain of salt! 

Stefano Guandalini, MD, is a pediatric gastroenterologist at the University of Chicago Medical Center. He has no conflicts of interest.

Immune modulators help anti-TNF agents battle Crohn’s disease, but not UC

Timely findings on treatment optimization

Adding an immune modulator (IM) to anti–tumor necrosis factor (anti-TNF) initiation therapy benefits patients with Crohn’s disease (CD) but not those with ulcerative colitis (UC), according to a recent retrospective look at more than 1,000 cases.

The study showed that patients with CD who started combination therapy instead of monotherapy had lower rates of treatment ineffectiveness, experienced longer delays until hospitalization, and less often needed to switch their anti-TNF agent, reported lead author Laura E. Targownik, MD, of the University of Manitoba, in Winnipeg, Canada, and colleagues.

“Current guidelines on the medical management of IBD strongly support the use of IMs and anti-TNFs in combination over anti-TNF monotherapy,” the investigators wrote in Clinical Gastroenterology and Hepatology. “However, there is a sparsity of real-world data demonstrating the incremental benefits of combination therapy.”

The investigators noted that the SONIC trial, conducted in 2010, showed that patients treated with combination therapy were more likely to achieve corticosteroid-free remission at weeks 26 and 50; this became the basis of evidence leading multiple clinical guidelines to recommend combination therapy for patients with CD.

The present study involved 852 patients with CD and 303 with UC who began treatment with an anti-TNF agent during 2001-2016. Data were drawn from the Manitoba Inflammatory Bowel Disease (IBD) Epidemiology database.

The main outcome of interest was treatment ineffectiveness, which was defined by any of the following four events: acute, IBD-related hospital admission for more than 48 hours; resective intestinal surgery; corticosteroid use at least 14 days after initiating anti-TNF therapy, or, if corticosteroids were used within 16 weeks of anti-TNF initiation, then subsequent corticosteroid use occurring at least 16 weeks after initiation; or switching to a different anti-TNF agent. The investigators also looked for differences in effectiveness between two agents from each class: anti-TNF agents infliximab and adalimumab, and immunomodulators methotrexate and azathioprine.

Results showed that patients with CD had higher rates of ineffectiveness-free survival when treated with combination therapy instead of monotherapy at 1 year (74.2% vs. 68.6%) and at 2 years (64.0% vs. 54.5%). In a Cox proportional hazards model, this translated to a 38% reduction in the risk of treatment ineffectiveness (adjusted hazard ratio, 0.62).

“This suggests that the findings of the SONIC trial may extend to real-world clinical practice, even in patients who had previous IM exposure,” the investigators noted.
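
For readers unfamiliar with how an adjusted hazard ratio like the one above is obtained, the sketch below fits a Cox proportional hazards model with the lifelines library on simulated data. The column names, covariate, censoring rule, and effect size are all hypothetical; the code illustrates the method only and does not use or reproduce the study data.

```python
# Illustrative only: estimating an adjusted hazard ratio for treatment
# ineffectiveness with a Cox proportional hazards model on SIMULATED data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 852  # number of CD patients in the cohort; the rows below are simulated

df = pd.DataFrame({
    "combination_therapy": rng.integers(0, 2, n),    # 1 = anti-TNF plus immunomodulator
    "age_at_start": rng.normal(38, 12, n).round(1),  # hypothetical adjustment covariate
})

# Simulated time to treatment ineffectiveness (months), with a protective
# effect of combination therapy, censored administratively at 24 months.
time = rng.exponential(36, n) * np.where(df["combination_therapy"] == 1, 1.6, 1.0)
df["ineffective"] = (time < 24).astype(int)
df["months_to_event"] = np.minimum(time, 24)

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_event", event_col="ineffective")
print(np.exp(cph.params_["combination_therapy"]))  # adjusted HR; values below 1 favor combination
```

In the actual analysis, adjustment covariates came from the administrative database, and the fitted hazard ratio for combination therapy was the reported 0.62.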

Combination therapy was also significantly associated with longer times to first IBD-related hospitalization (HR, 0.53) and to switching anti-TNF agents (HR, 0.63). However, no such relationships were found for time to resective surgery or corticosteroid use. Although combination therapy had no impact on the rate of primary treatment ineffectiveness in multivariable logistic regression, those who received anti-TNF therapy for more than 90 days had delayed secondary treatment ineffectiveness and fewer IBD-related hospitalizations. Choice of agent from either class had no influence on the effectiveness of combination therapy.

In contrast with the above findings, combination therapy in patients with UC was less promising, which aligns with previous studies.

“[W]e were not able to demonstrate a significant advantage to combination therapy in persons with UC,” the investigators wrote. “In addition, all published cohort studies to date have not been able to confirm a significant benefit to combination therapy in UC. ... In light of the lower quality of prior evidence, combined with the results from our study, the indication for combination therapy in UC would appear to be weaker.”

“Further analyses in larger cohorts may clarify whether there is a clinically relevant benefit of combination therapy in persons with UC,” the investigators concluded. “Because of the discrepancy between our findings and those of a meta-analysis of cohort studies previously published on this topic, confirmation of our results is required in future studies.”

The investigators disclosed no funding or conflicts of interest.

SOURCE: Targownik LE et al. Clin Gastroenterol Hepatol. 2018 Nov 15. doi: 10.1016/j.cgh.2018.11.003.

Twenty years after the approval of the first anti–tumor necrosis factor (TNF) biologic agent for the treatment of inflammatory bowel disease (IBD), patients and providers are still learning how to optimize these medications. One optimization is the use of combination therapy (immunomodulator and anti-TNF). Immunomodulators are used independently for maintenance of remission of IBD, and they have been shown to reduce immunogenicity and improve efficacy when used in combination with an anti-TNF agent in prior short-term randomized controlled trials. However, use of combination therapy in the real-world is not universally practiced. Data are lacking on the risks and benefits of long-term use of these agents. Therefore, this article by Targownik et al. is very timely.

Patients with Crohn’s disease treated with combination therapy in this population-based cohort had improved efficacy including a significant decrease in treatment ineffectiveness, increased time to first hospitalization, and increased time to anti-TNF medication switch.

Importantly, a mixed group of patients who had previously been on azathioprine monotherapy and those newly starting this therapy at the time of anti-TNF initiation were included in this cohort (a group similar to what we see in real-world practice). Data on risk factors for disease complications, such as disease phenotype or severity, were not available. By contrast, none of the efficacy associations were improved in the smaller group of patients with ulcerative colitis on combination therapy.

As providers counsel patients on the benefits and risks of various IBD treatment choices, these data by Targownik et al. will inform decisions. Future research should incorporate additional means of biologic optimization, such as the use of therapeutic drug monitoring and/or risk factor–based selection of therapeutic agents, to better inform individualized treatment choices.

Millie D. Long MD, MPH, is an associate professor of medicine in the division of gastroenterology and hepatology; Inflammatory Bowel Diseases Center; vice chief for education; director, Gastroenterology and Hepatology Fellowship Program at the University of North Carolina at Chapel Hill. She has the following conflicts of interest: AbbVie, Takeda, Pfizer, UCB, Janssen, Salix, Prometheus, Target Pharmasolutions, and Valeant. 
 

Publications
Topics
Sections
Body

Twenty years after the approval of the first anti–tumor necrosis factor (TNF) biologic agent for the treatment of inflammatory bowel disease (IBD), patients and providers are still learning how to optimize these medications. One optimization is the use of combination therapy (immunomodulator and anti-TNF). Immunomodulators are used independently for maintenance of remission of IBD, and they have been shown to reduce immunogenicity and improve efficacy when used in combination with an anti-TNF agent in prior short-term randomized controlled trials. However, use of combination therapy in the real-world is not universally practiced. Data are lacking on the risks and benefits of long-term use of these agents. Therefore, this article by Targownik et al. is very timely.

Dr. Millie Long
Patients with Crohn’s disease treated with combination therapy in this population-based cohort had improved efficacy including a significant decrease in treatment ineffectiveness, increased time to first hospitalization, and increased time to anti-TNF medication switch.

Importantly, a mixed group of patients who had previously been on azathioprine monotherapy and those newly starting this therapy at the time of anti-TNF initiation were included in this cohort (a group similar to what we see in real-world practice). Data on risk factors for disease complications, such as disease phenotype or severity, were not available. By contrast, none of the efficacy associations were improved in the smaller group of patients with ulcerative colitis on combination therapy.

As providers counsel patients on the benefits and risks of various IBD treatment choices, these data by Targownik et al. will inform decisions. Future research should incorporate additional means of biologic optimization, such as the use of therapeutic drug monitoring and/or risk factor–based selection of therapeutic agents, to better inform individualized treatment choices.

Millie D. Long, MD, MPH, is an associate professor of medicine in the division of gastroenterology and hepatology, Inflammatory Bowel Diseases Center; vice chief for education; and director of the Gastroenterology and Hepatology Fellowship Program at the University of North Carolina at Chapel Hill. She has the following conflicts of interest: AbbVie, Takeda, Pfizer, UCB, Janssen, Salix, Prometheus, Target Pharmasolutions, and Valeant.
 



 

Adding an immune modulator (IM) to anti–tumor necrosis factor (anti-TNF) initiation therapy benefits patients with Crohn’s disease (CD) but not those with ulcerative colitis (UC), according to a recent retrospective look at more than 1,000 cases.

The study showed that patients with CD who started combination therapy instead of monotherapy had lower rates of treatment ineffectiveness, experienced longer delays until hospitalization, and less often needed to switch their anti-TNF agent, reported lead author Laura E. Targownik, MD, of the University of Manitoba, in Winnipeg, Canada, and colleagues.

“Current guidelines on the medical management of IBD strongly support the use of IMs and anti-TNFs in combination over anti-TNF monotherapy,” the investigators wrote in Clinical Gastroenterology and Hepatology. “However, there is a sparsity of real-world data demonstrating the incremental benefits of combination therapy.”

The investigators noted that the SONIC trial, published in 2010, showed that patients treated with combination therapy were more likely to achieve corticosteroid-free remission at weeks 26 and 50; this evidence led multiple clinical guidelines to recommend combination therapy for patients with CD.

The present study involved 852 patients with CD and 303 with UC who began treatment with an anti-TNF agent during 2001-2016. Data were drawn from the Manitoba Inflammatory Bowel Disease (IBD) Epidemiology database.

The main outcome of interest was treatment ineffectiveness, defined by any of the following four events: acute IBD-related hospital admission lasting more than 48 hours; resective intestinal surgery; corticosteroid use starting at least 14 days after anti-TNF initiation (or, if corticosteroids were used within 16 weeks of anti-TNF initiation, subsequent corticosteroid use occurring at least 16 weeks after initiation); or a switch to a different anti-TNF agent. The investigators also looked for differences in effectiveness between the two agents studied from each class: the anti-TNF agents infliximab and adalimumab, and the immunomodulators methotrexate and azathioprine.

Results showed that patients with CD had higher rates of ineffectiveness-free survival when treated with combination therapy rather than monotherapy at 1 year (74.2% vs. 68.6%) and 2 years (64.0% vs. 54.5%). In a Cox proportional hazards model, this translated to a 38% lower risk of treatment ineffectiveness (adjusted hazard ratio, 0.62).

“This suggests that the findings of the SONIC trial may extend to real-world clinical practice, even in patients who had previous IM exposure,” the investigators noted.

Combination therapy was also significantly associated with longer time to first IBD-related hospitalization (HR, 0.53) and longer time to switching anti-TNF agents (HR, 0.63). However, no such relationships were found for time to resective surgery or corticosteroid use. Although combination therapy had no impact on the rate of primary treatment ineffectiveness in multivariable logistic regression, among patients who received anti-TNF therapy for more than 90 days, combination therapy was associated with delayed secondary treatment ineffectiveness and fewer IBD-related hospitalizations. The choice of agent within either class did not influence the effectiveness of combination therapy.

In contrast, combination therapy was less promising in patients with UC, a finding that aligns with previous studies.

“[W]e were not able to demonstrate a significant advantage to combination therapy in persons with UC,” the investigators wrote. “In addition, all published cohort studies to date have not been able to confirm a significant benefit to combination therapy in UC. ... In light of the lower quality of prior evidence, combined with the results from our study, the indication for combination therapy in UC would appear to be weaker.”

“Further analyses in larger cohorts may clarify whether there is a clinically relevant benefit of combination therapy in persons with UC,” the investigators concluded. “Because of the discrepancy between our findings and those of a meta-analysis of cohort studies previously published on this topic, confirmation of our results is required in future studies.”

The investigators disclosed no funding or conflicts of interest.

SOURCE: Targownik LE et al. Clin Gastroenterol Hepatol. 2018 Nov 15. doi: 10.1016/j.cgh.2018.11.003.

 



FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
