Acute hepatic porphyrias no longer as rare as previously thought

Although rare, acute hepatic porphyrias (AHPs) may be more common than previously thought, particularly among women between ages 15 and 50, according to a new clinical practice update from the American Gastroenterological Association.

For acute attacks, treatment should include intravenous hemin, and for patients with recurrent attacks, a newly approved therapy called givosiran should be considered, wrote the authors of the update, which was published Jan. 13 in Gastroenterology.

Dr. Bruce Wang

“Diagnoses of AHPs are often missed, with a delay of more than 15 years from initial presentation. The key to early diagnosis is to consider the diagnosis, especially in patients with recurring severe abdominal pain not ascribable to other causes,” wrote the authors, who were led by Bruce Wang, MD, a hepatologist with the University of California, San Francisco.

AHPs are inherited disorders of heme metabolism that include acute intermittent porphyria, hereditary coproporphyria, variegate porphyria, and porphyria due to severe deficiency of 5-aminolevulinic acid dehydratase.

Acute intermittent porphyria (AIP) is the most common type, with an estimated prevalence of symptomatic AHP of 1 in 100,000 patients. However, population-level genetic studies show that the prevalence of pathogenic variants for AIP is between 1 in 1,300 and 1 in 1,785.

The major clinical presentation includes attacks of severe abdominal pain, nausea, vomiting, constipation, muscle weakness, neuropathy, tachycardia, and hypertension, yet without peritoneal signs or abnormalities on cross-sectional imaging.

Recent advances in treatment have improved the outlook for patients with AHP. To provide timely guidance, the authors developed 12 clinical practice advice statements on the diagnosis and management of AHPs based on a review of the published literature and expert opinion.

First, AHP screening should be considered in the evaluation of all patients, particularly women of childbearing age (15-50 years), who have recurrent severe abdominal pain without a clear etiology. About 90% of patients with symptomatic AHP are women, and more than 90% of them experience only one or a few acute attacks in their lifetime, often precipitated by factors that increase the activity of the enzyme ALAS1 in the liver.

For initial AHP diagnosis, biochemical testing should measure porphobilinogen (PBG) and delta-aminolevulinic acid (ALA), corrected to creatinine, on a random urine sample. All patients with significantly elevated urinary PBG or ALA should initially be presumed to have AHP; during acute attacks, both are elevated to at least five times the upper limit of normal. Because ALA and PBG are porphyrin precursors, urine porphyrin testing should not be used alone for AHP screening.
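
The screening rule above lends itself to a simple worked example. The sketch below is illustrative only and is not part of the AGA clinical practice update: the function name and the upper-limit-of-normal (ULN) parameters are hypothetical, and real reference ranges are laboratory-specific.

```python
# Illustrative sketch of the screening logic described above, assuming
# creatinine-corrected urinary values and laboratory-supplied upper limits
# of normal (ULNs). All names and thresholds here are placeholders.

def interpret_urine_screen(pbg: float, ala: float,
                           pbg_uln: float, ala_uln: float) -> str:
    """Interpret creatinine-corrected PBG and ALA from a random urine sample."""
    pbg_fold = pbg / pbg_uln
    ala_fold = ala / ala_uln

    if pbg_fold >= 5 and ala_fold >= 5:
        # During acute attacks, both PBG and ALA are at least ~5x ULN.
        return "markedly elevated: consistent with an acute attack; presume AHP"
    if pbg_fold > 1 or ala_fold > 1:
        # Any significant elevation warrants a presumptive AHP diagnosis
        # pending genetic confirmation.
        return "elevated: presume AHP, confirm with genetic testing"
    return "not elevated: AHP unlikely to explain current symptoms"
```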

Genetic testing should then be used to confirm the AHP diagnosis and to establish the specific type of AHP. Pathogenic variants in the four genes ALAD, HMBS, CPOX, and PPOX cause aminolevulinic acid dehydratase deficiency porphyria, acute intermittent porphyria, hereditary coproporphyria, and variegate porphyria, respectively, and whole-gene sequencing identifies about 95%-99% of cases. First-degree family members should be screened with genetic testing, and those who are mutation carriers should be counseled.

For acute attacks of AHP that are severe enough to require hospitalization, the currently approved treatment is intravenous hemin infusion, usually given once daily at a dose of 3-4 mg/kg body weight for 3-5 days. Because of the risk of thrombophlebitis, hemin is best administered into a high-flow central vein via a peripherally inserted central catheter or a central port.
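
As a quick arithmetic check of the dosing described above, the sketch below computes the once-daily dose range from body weight. The helper is hypothetical and purely illustrative; actual dosing, rounding, and administration follow product labeling and institutional protocols.

```python
# Hemin dosing described above: 3-4 mg/kg body weight once daily for 3-5 days.
# Illustrative only; does not account for vial size, dose caps, or protocols.

def hemin_daily_dose_range_mg(weight_kg: float) -> tuple[float, float]:
    """Return the (low, high) once-daily hemin dose in mg."""
    return (3.0 * weight_kg, 4.0 * weight_kg)

low, high = hemin_daily_dose_range_mg(70)
print(f"70-kg patient: {low:.0f}-{high:.0f} mg IV once daily for 3-5 days")
# -> 70-kg patient: 210-280 mg IV once daily for 3-5 days
```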

In addition, treatment for acute attacks should include analgesics, antiemetics, and management of systemic arterial hypertension, tachycardia, hyponatremia, and hypomagnesemia. The primary goal of treatment during an acute attack is to decrease ALA production. Patients should be counseled to avoid identifiable triggers, such as porphyrinogenic medications, excess alcohol intake, tobacco use, and caloric deprivation.

Although recent advances have improved treatment for acute attacks, management of patients with frequent attacks remains challenging, the authors wrote. About 3%-5% of patients with symptomatic AHP experience recurrent attacks, defined as four or more attacks per year. These attacks typically aren’t associated with identifiable triggers, although some that occur during the luteal phase of the menstrual cycle are believed to be triggered by progesterone. However, hormonal suppression therapy, such as GnRH agonists, has had limited success.

Off-label use of prophylactic intravenous heme therapy is common, although the effectiveness in preventing recurrent attacks isn’t well-established. In addition, chronic hemin use is associated with several complications, including infections, iron overload, and the need for indwelling central venous catheters.

Recently, the Food and Drug Administration approved givosiran, a small interfering RNA-based therapy that targets delta-aminolevulinate synthase 1, for treatment in adults with AHP. Monthly subcutaneous therapy appears to significantly lower rates of acute attacks among patients who experience recurrent attacks.

“We suggest prescribing givosiran only for those patients with recurrent acute attacks that are both biochemically and genetically confirmed,” the authors wrote. “Due to limited safety data, givosiran should not be used in women who are pregnant or planning a pregnancy.”

In the most severe cases, liver transplantation should be limited to patients with intractable symptoms and significantly decreased quality of life who are refractory to pharmacotherapy. If living donor transplantation is considered, genetic testing should be used to screen related living donors, since HMBS pathogenic variants in asymptomatic donors could result in poor posttransplantation outcomes.

Over the long term, patients with AHP should be monitored annually for liver disease and for chronic kidney disease, with serum creatinine and estimated glomerular filtration rate checked. Patients also face an increased risk of hepatocellular carcinoma and should begin screening at age 50, with a liver ultrasound every 6 months.

“Fortunately, most people with genetic defects never experience severe acute attacks or may experience only one or a few attacks throughout their lives,” the authors wrote.

The authors (Bruce Wang, MD, Herbert L. Bonkovsky, MD, AGAF, and Manisha Balwani, MD, MS) reported that they are part of the Porphyrias Consortium. The Porphyrias Consortium is part of the Rare Diseases Clinical Research Network, an initiative of the Division of Rare Diseases Research Innovation at the National Center for Advancing Translational Sciences. The consortium is funded through a collaboration between the center and the National Institute of Diabetes and Digestive and Kidney Diseases. Several authors disclosed funding support and honoraria for advisory board roles with various pharmaceutical companies, including Alnylam, which makes givosiran.

This article was updated 2/3/23.

Childhood behavioral, emotional problems linked to poor economic and social outcomes in adulthood

Children with chronically elevated externalizing symptoms, such as behavioral problems, or internalizing symptoms, such as mental health concerns, have an increased risk for poor economic and social outcomes in adulthood, data from a new study suggest.

Children with comorbid externalizing and internalizing symptoms were especially vulnerable to long-term economic and social exclusion.

“Research has mostly studied the outcomes of children with either behavioral problems or depression-anxiety problems. However, comorbidity is the rule rather than the exception in clinical practice,” senior author Massimiliano Orri, PhD, an assistant professor of psychiatry at McGill University and clinical psychologist with the Douglas Mental Health University Institute, both in Montreal, said in an interview.

“Our findings are important, as they show that comorbidity between externalizing and internalizing problems is associated with real-life outcomes that profoundly influence a youth’s chances to participate in society later in life,” he said.

The study was published in JAMA Network Open.

Analyzing associations

Dr. Orri and colleagues analyzed data for 3,017 children in the Quebec Longitudinal Study of Kindergarten Children, a population-based birth cohort that enrolled participants in 1986-1987 and 1987-1988 while they were attending kindergarten. The sample included 2,000 children selected at random and 1,017 children who scored at or above the 80th percentile for disruptive behavior problems.

The research team looked at the association between childhood behavioral profiles and economic and social outcomes for ages 19-37 years, including employment earnings, receipt of welfare, intimate partnerships, and having children living in the household. They obtained the outcome data from participants’ tax returns for 1998-2017.

During enrollment in the study, the children’s teachers assessed behavioral symptoms annually from ages 6 to 12 years using the Social Behavior Questionnaire. Based on the assessments, the research team categorized the students as having no or low symptoms, high externalizing symptoms only (such as hyperactivity, impulsivity, aggression, and rule violation), high internalizing symptoms only (such as anxiety, depression, worry, and social withdrawal), or comorbid symptoms. They looked at other variables as well, including the child’s sex, the parents’ age at the birth of their first child, the parents’ years of education, family structure, and the parents’ household income.

Among the 3,017 participants, 45.4% of children had no or low symptoms, 29.2% had high externalizing symptoms, 11.7% had high internalizing symptoms, and 13.7% had comorbid symptoms. About 53% were boys, and 47% were girls.

In general, boys were more likely to exhibit high externalizing symptoms, and girls were more likely to exhibit high internalizing symptoms. In the comorbid group, about 82% were boys, and they were more likely to have younger mothers, come from households with lower earnings when they were ages 3-5 years, and have a nonintact family at age 6 years.

The average age at follow-up was 37 years. Participants earned an average of $32,800 per year at ages 33-37 years (between 2013 and 2017). During the 20 years of follow-up, participants received welfare support for about 1.5 years, had an intimate partner for 7.4 years, and had children living in the household for 11 years.

Overall, participants in the high externalizing and high internalizing symptom profiles – and especially those in the comorbid profile – had lower earnings and a higher incidence of annual welfare receipt across early adulthood, compared with participants with low or no symptoms. They were also less likely to have an intimate partner or have children living in the household. Participants with a comorbid symptom profile earned $15,031 less per year and had a 3.79-times higher incidence of annual welfare receipt.

Lower earnings

Across the sample, men were more likely to have higher earnings and less likely to receive welfare each year, but they also were less likely to have an intimate partner or children in the household. Among those with the high externalizing profile, men were significantly less likely to receive welfare. Among those with the comorbid profile, men were less likely to have children in the household.

Compared with the no-symptom or low-symptom profile, those in the high externalizing profile earned $5,904 less per year and had a twofold higher incidence of welfare receipt. Those in the high internalizing profile earned $8,473 less per year, had a 2.07-times higher incidence of welfare receipt, and had a lower incidence of intimate partnership.

Compared with the high externalizing profile, those in the comorbid profile earned $9,126 less per year, had a higher incidence of annual welfare receipt, had a lower incidence of intimate partnership, and were less likely to have children in the household. Similarly, compared with the high internalizing profile, those in the comorbid profile earned $6,558 less per year and were more likely to exhibit the other poor long-term outcomes. Participants in the high internalizing profile earned $2,568 less per year than those in the high externalizing profile.

During a 40-year working career, the estimated lost personal employment earnings were $140,515 for the high externalizing profile, $201,657 for the high internalizing profile, and $357,737 for the comorbid profile, compared with those in the no-symptom or low-symptom profile.

“We know that children with externalizing and internalizing symptoms can have many problems in the short term – like social difficulties and lower education attainment – but it’s important to also understand the potential long-term outcomes,” study author Francis Vergunst, DPhil/PhD, an associate professor of child psychosocial difficulties at the University of Oslo, told this news organization.

“For example, when people have insufficient income, are forced to seek welfare support, or lack the social support structure that comes from an intimate partnership, it can have profound consequences for their mental health and well-being – and for society as a whole,” he said. “Understanding this helps to build the case for early prevention programs that can reduce childhood externalizing and internalizing problems and improve long-term outcomes.”

Several mechanisms could explain the associations found across the childhood symptom profiles, the study authors wrote. For instance, children with early behavior problems may be more likely to engage in risky adolescent activities, such as substance use, delinquent peer affiliations, and academic underachievement, which affects their transition to adulthood and accumulation of social and economic capital throughout life. Those with comorbid symptoms likely experience a compounded effect.

Future studies should investigate how to intervene effectively to support children, particularly those with comorbid externalizing and internalizing symptoms, the study authors write.

“Currently, most published studies focus on children with either externalizing or internalizing problems (and these programs can be effective, especially for externalizing problems), but we know very little about how to improve long-term outcomes for children with comorbid symptoms,” Dr. Vergunst said. “Given the large costs of these problems for individuals and society, this is a critical area for further research.”

‘Solid evidence’

Commenting on the findings, Ian Colman, PhD, a professor of epidemiology and public health and director of the Applied Psychiatric Epidemiology Across the Life course (APEAL) lab at the University of Ottawa, said, “Research like this provides solid evidence that if we do not provide appropriate supports for children who are struggling with their mental health or related behaviors, then these children are more likely to face a life of social and economic exclusion.”

Dr. Colman, who wasn’t involved with this study, has researched long-term psychosocial outcomes among adolescents with depression, as well as those with externalizing behaviors. He and colleagues have found poorer outcomes among those who exhibit mild or severe difficulties during childhood.

“Studying the long-term outcomes associated with child and adolescent mental and behavioral disorders gives us an idea of how concerned we should be about their future,” he said.

Dr. Vergunst was funded by postdoctoral fellowships from the Canadian Institutes of Health Research and the Fonds de Recherche du Québec Santé. Dr. Orri and Dr. Colman report no relevant financial relationships.

A version of this article first appeared on Medscape.com.

FAST score appears accurate for diagnosis of fibrotic NASH

The FibroScan-aspartate aminotransferase (FAST) score shows high sensitivity and specificity for noninvasive identification of patients with fibrotic nonalcoholic steatohepatitis (NASH), according to a new systematic review and meta-analysis.

The FAST score had an overall sensitivity of 89% at the defined rule-out cutoff of .35 or lower and an overall specificity of 89% at the rule-in cutoff of .67 or higher, Federico Ravaioli, MD, PhD, a gastroenterologist at the University of Modena & Reggio Emilia in Italy, and colleagues wrote in Gut.

Dr. Federico Ravaioli

“These results could be used in clinical screening studies to efficiently identify patients at risk of progressive NASH, who should be referred for a conclusive liver biopsy, and who might benefit from treatment with emerging pharmacotherapies,” the authors wrote.

The research team analyzed 12 observational studies with 5,835 participants with biopsy-confirmed nonalcoholic fatty liver disease (NAFLD) between February 2020 and April 2022. They included articles that reported data for calculating the sensitivity and specificity of the FAST score for identifying adult patients with fibrotic NASH, based on a defined rule-out cutoff of .35 or lower and a rule-in cutoff of .67 or higher. Fibrotic NASH was defined as NASH plus a NAFLD activity score of 4 or greater and fibrosis stage 2 or higher.

The pooled prevalence of fibrotic NASH was 28%. The mean age of participants ranged from 40 to 60, and the proportion of men ranged from 23% to 91%. The mean body mass index ranged from 23 kg/m2 to 41 kg/m2, with a prevalence of obesity ranging from 23% to 100% and preexisting type 2 diabetes ranging from 18% to 60%. Nine studies included patients with biopsy-proven NAFLD from tertiary care liver centers, and three studies included patients from bariatric clinics or bariatric surgery centers with available liver biopsy data.

Fibrotic NASH was ruled out in 2,723 patients (45.5%) by a FAST score of .35 or lower and ruled in for 1,287 patients (21.5%) by a FAST score of .67 or higher. In addition, 1,979 patients (33%) had a FAST score in the so-called “grey” intermediate zone.

Overall, the FAST score pooled sensitivity was 89%, and the pooled specificity was 89%. By the rule-out cutoff of .35, the sensitivity was 89% and the specificity was 56%. By the rule-in cutoff of .67, the sensitivity was 46% and the specificity was 89%.

At an expected prevalence of fibrotic NASH of 30%, the negative predictive value of the .35 cutoff was 92%, and the positive predictive value of the .67 cutoff was 65%. Across the included studies, the negative predictive value ranged from 77% to 98%, and the positive predictive value ranged from 32% to 87%.

For the rule-in cutoff of .67, at pretest probabilities of 10%, 20%, 26.3%, and 30%, the post-test probability of fibrotic NASH after a positive result was 32%, 52%, 60%, and 65%, respectively. For the rule-out cutoff of .35, at the same pretest probabilities, the probability of fibrotic NASH despite a negative result was 2%, 5%, 7%, and 8%, respectively.
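
The predictive values and post-test probabilities quoted in the two paragraphs above follow from a standard Bayes calculation using the pooled sensitivity and specificity at each cutoff. The sketch below is illustrative, not code from the meta-analysis; small differences from the published figures reflect rounding of the pooled inputs.

```python
# Post-test probability from sensitivity, specificity, and pretest probability,
# computed via likelihood ratios. Pooled sensitivity and specificity at each
# FAST cutoff are taken from the figures reported above.

def post_test_probability(sens: float, spec: float, pretest: float,
                          positive_result: bool) -> float:
    lr = sens / (1 - spec) if positive_result else (1 - sens) / spec
    pre_odds = pretest / (1 - pretest)
    post_odds = pre_odds * lr
    return post_odds / (1 + post_odds)

# Rule-in cutoff (.67 or higher): sensitivity 46%, specificity 89%.
# At 30% prevalence a positive result gives a post-test probability of ~0.64,
# in line with the ~65% positive predictive value reported above.
print(post_test_probability(0.46, 0.89, 0.30, True))

# Rule-out cutoff (.35 or lower): sensitivity 89%, specificity 56%.
# At 30% prevalence the probability of fibrotic NASH despite a negative result
# is ~0.08, i.e., a negative predictive value of ~92%.
print(post_test_probability(0.89, 0.56, 0.30, False))

# Post-test probabilities after a positive rule-in result across pretest levels:
# roughly 32%, 51%, 60%, and 64%, close to the reported 32%, 52%, 60%, and 65%.
for pretest in (0.10, 0.20, 0.263, 0.30):
    p = post_test_probability(0.46, 0.89, pretest, True)
    print(f"pretest {pretest:.1%} -> post-test probability {p:.0%}")
```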

In subgroup analyses, the sensitivity of the rule-out cutoff was significantly affected by study design. In addition, age and BMI above the median both affected pooled sensitivity but not pooled specificity. The rule-in cutoff, on the other hand, was significantly affected by study design, BMI above the median, and a prevalence of preexisting type 2 diabetes above the median.

“Today, we stand on the cusp of a revolutionary time to treat NASH. This is due in part to the fact that many exciting, novel precision metabolic treatments are in the pipeline to combat this disease,” said Brian DeBosch, MD, PhD, associate professor of cell biology and physiology at Washington University in St. Louis, who was not involved with this study.

Dr. Brian DeBosch


“A major barrier in clinical NASH management is a rapid, noninvasive, and precise means by which to clinically stage such patients,” Dr. DeBosch said. “We now approach as closely as ever the sensitivity and specificity required to stratify the highest-risk patients, identify candidates for advanced therapy, and meaningfully reduce biopsies through using noninvasive testing.”

Dr. DeBosch noted the importance of pretest probability and specific subpopulations when deciding whether to use the FAST score. For instance, he said, a tertiary academic liver transplant center will see a different patient population than in a primary care setting. Also, in this study, the presence or absence of diabetes and a BMI above 30 significantly altered sensitivity and specificity.

“One important remaining question stemming from these data is whether FAST can also be used as a surrogate measure to follow disease regression over time following intervention,” Dr. DeBosch said. “Even if FAST is not useful in that way, defining individuals who most need to undergo biopsy and/or those who need to undergo treatment remain important uses for this test.”

The study authors did not declare a specific funding source or report any competing interests. Dr. DeBosch reported no relevant disclosures.



Updated celiac disease guideline addresses common clinical questions

Article Type
Changed
Mon, 01/23/2023 - 09:54

The American College of Gastroenterology issued updated guidelines for celiac disease diagnosis, management, and screening that incorporate research conducted since the last update in 2013.

The guidelines offer evidence-based recommendations for common clinical questions on topics that include nonbiopsy diagnosis, gluten-free oats, probiotic use, and gluten-detection devices. They also point to areas for ongoing research.

“The main message of the guideline is all about quality of care,” Alberto Rubio-Tapia, MD, a gastroenterologist at the Cleveland Clinic, said in an interview.

“A precise celiac disease diagnosis is just the beginning of the role of the gastroenterologist,” he said. “But most importantly, we need to take care of our patients’ needs with good goal-directed follow-up using a multidisciplinary approach, with experienced dietitians playing an important role.”

The update was published in the American Journal of Gastroenterology.
 

Diagnosis recommendations

The ACG assembled a team of celiac disease experts and expert guideline methodologists to develop an update with high-quality evidence, Dr. Rubio-Tapia said. The authors made recommendations and suggestions for future research regarding eight questions concerning diagnosis, disease management, and screening.

For diagnosis, the guidelines recommend esophagogastroduodenoscopy (EGD) with multiple duodenal biopsies – one or two from the bulb and four from the distal duodenum – for confirmation in children and adults with suspicion of celiac disease. EGD and duodenal biopsies can also be useful for the differential diagnosis of other malabsorptive disorders or enteropathies, the authors wrote.

For children, a nonbiopsy option may be considered reliable for diagnosis. This option requires a combination of high-level tissue transglutaminase (TTG) IgA – at greater than 10 times the upper limit of normal – and a positive endomysial antibody finding in a second blood sample. The same criteria may also be considered for symptomatic adults who are unwilling or unable to undergo upper GI endoscopy.
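
As a concrete illustration of the preceding paragraph, the pediatric nonbiopsy criterion can be written out as an explicit decision rule. The short Python sketch below is hypothetical and is not part of the ACG guideline; the function name, inputs, and example values are invented. It encodes the two conditions: TTG-IgA above 10 times the assay’s upper limit of normal plus a positive endomysial antibody on a second blood sample.

def meets_pediatric_nonbiopsy_criteria(ttg_iga, assay_upper_limit_of_normal, ema_positive_on_second_sample):
    # Hypothetical encoding of the pediatric nonbiopsy option: TTG-IgA more than
    # 10 times the assay's upper limit of normal AND a positive endomysial
    # antibody result from a separately drawn (second) blood sample.
    return (ttg_iga > 10 * assay_upper_limit_of_normal) and ema_positive_on_second_sample

# Invented example values: a TTG-IgA of 250 U/mL on an assay whose upper limit of
# normal is 20 U/mL (12.5 times the limit), plus a positive EMA on a second draw.
print(meets_pediatric_nonbiopsy_criteria(250.0, 20.0, True))   # True
print(meets_pediatric_nonbiopsy_criteria(150.0, 20.0, True))   # False: only 7.5 times the limit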

For children younger than 2 years who are not IgA deficient, TTG-IgA is the preferred test. For children with IgA deficiency, testing should be performed using IgG-based antibodies.
 

Disease management guidance

After diagnosis, intestinal healing should be the endpoint for a gluten-free diet, the guidelines recommended. Clinicians and patients should discuss individualized goals of the gluten-free diet beyond clinical and serologic remission.

The standard of care for assessing patients’ diet adherence is an interview with a dietitian who has expertise in gluten-free diets, the recommendations stated. Subsequent visits should be encouraged as needed to reinforce adherence.

During disease management, upper endoscopy with intestinal biopsies can be helpful for monitoring cases in which there is a lack of clinical response or in which symptoms relapse despite a gluten-free diet, the authors noted.

In addition, after a shared decision-making conversation between the patient and provider, a follow-up biopsy could be considered for assessment of mucosal healing in adults who don’t have symptoms 2 years after starting a gluten-free diet, they wrote.

“Although most patients do well on a gluten-free diet, it’s a heavy burden of care and an important issue that impacts patients,” Joseph Murray, MD, a gastroenterologist at the Mayo Clinic in Rochester, Minn., said in an interview.

Dr. Murray, who wasn’t involved with this guideline update, contributed to the 2013 guidelines and the 2019 American Gastroenterological Association practice update on diagnosing and monitoring celiac disease. He agreed with many of the recommendations in this update.

“The goal of achieving healing is a good goal to reach. We do that routinely in my practice,” he said. “The older the patient, perhaps the more important it is to discuss, including the risk for complications. There’s a nuance involved with shared decision-making.”

Nutrition advice

The guidelines recommended against routine use of gluten-detection devices for food or biospecimens for patients with celiac disease. Although multiple devices have become commercially available in recent years, they are not regulated by the Food and Drug Administration and have sensitivity problems that can lead to false positive and false negative results, the authors noted. There’s also a lack of evidence that the devices enhance diet adherence or quality of life.

The evidence is insufficient to recommend for or against the use of probiotics for the treatment of celiac disease, the recommendations stated. Although dysbiosis is a feature of celiac disease, its role in disease pathogenesis and symptomatology is uncertain, the authors wrote.

Probiotics may help with functional disorders, such as irritable bowel syndrome, but because probiotics are marketed as supplements and regulations are lax, some products may contain detectable gluten despite being labeled gluten free, they added.

On the other hand, the authors recommended gluten-free oats as part of a gluten-free diet. Oat consumption appears to be safe for most patients with celiac disease, but it may be immunogenic in a subset of patients, depending on the products or quantity consumed. Given the small risk for an immune reaction to the oat protein avenin, monitoring for oat tolerance through symptoms and serology should be conducted, although the intervals for monitoring remain unknown.
 

Vaccination and screening

The guidelines also support vaccination against pneumococcal disease, since adults with celiac disease are at significantly increased risk of infection and complications. Vaccination is widely recommended for people aged 65 and older, for smokers aged 19-64, and for adults with underlying conditions that place them at higher risk, the authors noted.

Overall, the guidelines recommended case finding to increase detection of celiac disease in clinical practice but recommended against mass screening in the community. Patients with symptoms and laboratory evidence of malabsorption should be tested, as should those for whom celiac disease could be a treatable cause of symptoms, the authors wrote. Those with a first-degree family member who has a confirmed diagnosis should also be tested if they have possible symptoms, and asymptomatic relatives should consider testing as well.

The updated guidelines include changes that are important for patients and patient care, and they emphasize the need for continued research on key questions, Isabel Hujoel, MD, a gastroenterologist at the University of Washington Medical Center, Seattle, told this news organization.

“In particular, the discussion on the lack of evidence behind gluten-detection devices and probiotic use in celiac disease addresses conversations that come up frequently in clinic,” said Dr. Hujoel, who wasn’t involved with the update. “The guidelines also include a new addition below each recommendation where future research questions are raised. Many of these questions address gaps in our understanding on celiac disease, such as the possibility of a nonbiopsy diagnosis in adults, which will potentially dramatically impact patient care if addressed.”

The update received no funding. The authors, Dr. Murray, and Dr. Hujoel have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.


Vision screening at well-child visits cost-effective for detecting amblyopia

Article Type
Changed
Fri, 01/13/2023 - 16:27

Screening for amblyopia during primary care visits is more cost-effective than school-based screening or optometric examinations for kindergarten-aged children in Toronto, data suggest.

Because of the low prevalence of amblyopia among young children, a population-based screening program may not warrant the resources required, despite the added health benefits of a universal program, according to the researchers.

“Amblyopia is a public health problem. For this reason, population-wide approaches to detect and treat amblyopia are critical, and approaches such as school screening and mandated optometry exams have been recommended and introduced in some jurisdictions,” study author Afua Oteng Asare, OD, PhD, a research assistant professor at the University of Utah in Salt Lake City, told this news organization. Dr. Asare conducted the study as a PhD student at the University of Toronto.

“With increasing budgeting constraints and limited resources, policymakers are relying more on economic analyses that measure value-for-money to inform their decisions on programming,” she said. “Evidence comparing the cost-effectiveness of vision-testing approaches to the status quo is, however, limited.”

The study was published in JAMA Network Open.
 

Analyzing costs

Despite recommendations for routine testing, a notable percentage of children in Canada and the United States don’t receive an annual vision exam. The percentage is even higher among children from low-income households, said Dr. Asare. Universal screening in schools and mandatory optometric examinations may improve vision care. But the cost-effectiveness of these measures is unknown for certain conditions, such as amblyopia, the prevalence of which ranges between 3% and 5% in young children.

In Ontario, Canada’s largest province with about 3 million children, universal funding for children’s annual comprehensive eye exams and vision screening during well-child visits is provided through provincial health insurance.

In 2018, the Ontario Ministry of Health introduced guidelines for administering vision screening in kindergartens by public health departments. However, school-based screening has been difficult to introduce because of increasing costs and budgeting constraints, the authors wrote. As an alternative to underfunded programs, optometric associations in Canada have advocated for physicians to recommend early childhood optometric exams.

The investigators analyzed the incremental costs and health benefits, from the perspective of the Ontario government, of public health school screening and optometrist-based vision exams, compared with standard vision screening conducted during well-child visits with primary care physicians. They focused on the aim of detecting amblyopia and amblyopia-related risk factors in children between ages 3 and 5 years in Toronto.

For the analysis, the research team simulated a hypothetical cohort of 25,000 children over 15 years in a probabilistic health state transition model. They incorporated various assumptions, including that children had irreversible vision impairment if not diagnosed by an optometrist. In addition, incremental costs were adjusted to favor the standard screening strategy during well-child visits.

In the school-based and primary care scenarios, children with a positive or inconclusive test result were referred to an optometrist for diagnosis and treatment, which would incur the cost of an optometric evaluation. If positive, children were treated with prescription glasses and additional patching for amblyopia.

The research team measured outcomes as incremental quality-adjusted life-years (QALYs), and health utilities were derived from data on adults, because of the lack of data on children under age 6 years with amblyopia or amblyopia risk factors. The researchers also estimated direct costs to the Ontario government, including visits with primary care doctors, optometrists, public health nurses, and contract screeners, as well as prescription glasses for children with vision impairment who receive social assistance. Costs were expressed in Canadian dollars (CAD).

Overall, compared with the primary care screening strategy, the school screening and optometric examination strategies were generally less costly and had more health benefits. The incremental difference in cost was a savings per child of $84.09 CAD for school screening and $74.47 CAD for optometric examinations. Optometric examinations yielded the largest gain in QALYs, compared with the primary care screening strategy, producing average QALYs of 0.0508 per child.

However, only 20% of school screening iterations and 29% of optometric exam iterations were cost-effective, relative to the primary care screening strategy, at a willingness-to-pay threshold of $50,000 CAD per QALY gained. For instance, when comparing optometric exams with primary care screenings, if the cost of vision screening was $11.50 CAD, the incremental cost-effectiveness ratio would be $77.95 CAD per QALY gained.
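
For readers unfamiliar with the arithmetic behind these comparisons, the incremental cost-effectiveness ratio (ICER) and the willingness-to-pay test can be sketched in a few lines of Python. The sketch below is illustrative only and is not taken from the study’s model; apart from the $50,000 CAD per QALY threshold and the reported $84.09 CAD saving per child, the numbers are invented.

def icer(incremental_cost_cad, incremental_qalys):
    # Incremental cost-effectiveness ratio: extra cost per extra QALY gained.
    return incremental_cost_cad / incremental_qalys

def is_cost_effective(incremental_cost_cad, incremental_qalys, wtp_per_qaly_cad=50_000.0):
    # Conventional decision rule: cost-effective when net monetary benefit is positive.
    return wtp_per_qaly_cad * incremental_qalys - incremental_cost_cad > 0

# Invented single model iteration: a strategy costing $120 more per child and
# gaining 0.005 QALYs has an ICER of $24,000 per QALY, under the $50,000 threshold.
print(icer(120.0, 0.005))               # 24000.0
print(is_cost_effective(120.0, 0.005))  # True

# A strategy that is both cheaper and more effective (for example, a saving of
# $84.09 CAD per child alongside any QALY gain) "dominates" the comparator, and
# its ICER is not interpreted as a price per QALY.
print(is_cost_effective(-84.09, 0.005))  # True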

Results ‘make sense’  

“We were initially surprised that the alternative screening programs were not cost-effective, compared to status quo vision screening in well-child visits,” said Dr. Asare. “However, the results make sense, considering the study’s universal approach (screening all children regardless of their vision status) and the study’s consideration only of amblyopia, and not of refractive errors, which are even more common in kindergarten children.”

Dr. Asare noted the lack of current data on the rate of vision screenings conducted in childhood by primary care practitioners and on referrals to eye care providers for children with abnormal screenings. Data on vision health disparities and barriers to accessing vision care in young children also are scarce.

“My ultimate research goal is to create and evaluate evidence-based, cost-effective interventions to be used at the point of care by pediatric primary care providers to improve the quality of vision care for children, especially those from socioeconomically deprived backgrounds,” she said. “The take-home message is that school vision screening and mandated eye exams are excellent programs, but they may not be suitable for all contexts.”

Additional studies are needed to look at the cost-effectiveness of the different screening strategies for other aspects included in childhood vision tests, including binocular vision problems, refractive disorders, myopia, allergies, and rare eye diseases.
 

Significant underestimation?

Susan Leat, PhD, a researcher and professor emerita at the University of Waterloo (Ont.) School of Optometry and Vision Science, said, “This study only considers amblyopia, and not all eye diseases and disorders, which significantly underestimates the cost-effectiveness of optometric eye exams.”

Dr. Leat, who wasn’t involved with this study, has researched pediatric optometry and visual development. She and colleagues are developing new tools to test visual acuity in young children.

“If all disorders were taken into account, then optometric testing would be by far the most cost-effective,” she said. “Optometrists can detect all disorders, including more subtle disorders, which if uncorrected or untreated, can impact a child’s early learning.”

The study authors reported no funding for the study. Dr. Asare and Dr. Leat reported no relevant disclosures.

A version of this article first appeared on Medscape.com.


Which treatments improve long-term outcomes of critical COVID illness?

Article Type
Changed
Thu, 01/12/2023 - 16:41

Treatment with interleukin-6 receptor antagonists or antiplatelet agents improves survival and outcomes at 6 months for critically ill patients with COVID-19, according to new data.

However, survival wasn’t improved with therapeutic anticoagulation, convalescent plasma, or lopinavir-ritonavir, and survival was worsened with hydroxychloroquine.

“After critically ill patients leave the hospital, there’s a high risk of readmission, death after discharge, or exacerbations of chronic illness,” study author Patrick Lawler, MD, a clinician-scientist at the Peter Munk Cardiac Centre at University Health Network and an assistant professor of medicine at the University of Toronto, said in an interview.

“When looking at the impact of treatment, we don’t want to improve short-term outcomes yet worsen long-term disability,” he said. “That long-term, 6-month horizon is what matters most to patients.”

The study was published online in JAMA.
 

Investigating treatments

The investigators analyzed data from an ongoing platform trial called Randomized Embedded Multifactorial Adaptive Platform for Community Acquired Pneumonia (REMAP-CAP). The trial is evaluating treatments for patients with severe pneumonia in pandemic and nonpandemic settings.

In the trial, patients are randomly assigned to receive one or more interventions within the following six treatment domains: immune modulators, convalescent plasma, antiplatelet therapy, anticoagulation, antivirals, and corticosteroids. The trial’s primary outcome for patients with COVID-19 is hospital survival and organ support–free days up to 21 days. Researchers previously observed improvement after treatment with IL-6 receptor antagonists (which are immune modulators).

For this study, the research team analyzed data for 4,869 critically ill adult patients with COVID-19 who were enrolled between March 2020 and June 2021 at 197 sites in 14 countries. A 180-day follow-up was completed in March 2022. The critically ill patients had been admitted to an intensive care unit and had received respiratory or cardiovascular organ support.

The researchers examined survival through day 180 using hazard ratios (HRs): an HR of less than 1 represented improved survival, and an HR greater than 1 represented harm. Trial-defined futility corresponded to a relative improvement in outcome of less than 20%, that is, an HR greater than 0.83.
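
As a back-of-the-envelope illustration of where the 0.83 cutoff comes from, assuming the 20% figure is read as a relative hazard reduction (a sketch only, not the trial's statistical code):

# A 20% relative improvement corresponds to a hazard ratio of about 1/1.20;
# hazard ratios above that value were treated as futile.
relative_improvement = 0.20
futility_hr_threshold = 1 / (1 + relative_improvement)
print(round(futility_hr_threshold, 2))  # 0.83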

Among the 4,869 patients, 4,107 patients had a known mortality status, and 2,590 were alive at day 180. Among the 1,517 patients who died by day 180, 91 deaths (6%) occurred between hospital discharge and day 180.

Overall, use of IL-6 receptor antagonists (either tocilizumab or sarilumab) had a greater than 99.9% probability of improving 6-month survival, and use of antiplatelet agents (aspirin or a P2Y12 inhibitor such as clopidogrel, prasugrel, or ticagrelor) had a 95% probability of improving 6-month survival, compared with control therapies.
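
These figures are Bayesian posterior probabilities (REMAP-CAP uses a Bayesian adaptive design). Below is a minimal sketch of how such a probability can be read off posterior draws of a hazard ratio; the distribution and numbers are invented for illustration and are not the trial's actual model:

import numpy as np

rng = np.random.default_rng(0)
# Invented posterior draws of a log hazard ratio for one treatment; the real
# trial obtains these from a Bayesian hierarchical survival model.
log_hr_draws = rng.normal(loc=-0.15, scale=0.08, size=100_000)
prob_benefit = np.mean(np.exp(log_hr_draws) < 1.0)  # share of draws with HR < 1
print(f"posterior probability of improved survival: {prob_benefit:.1%}")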

In contrast, long-term survival wasn't improved with therapeutic anticoagulation, convalescent plasma, or lopinavir-ritonavir, with posterior probabilities of improved survival of only 11.5%, 54.7%, and 31.9%, respectively. The probability of trial-defined statistical futility was high for anticoagulation (99.9%), convalescent plasma (99.2%), and lopinavir-ritonavir (96.6%).

Long-term survival was worsened with hydroxychloroquine, with a posterior probability of harm of 96.9%. In addition, the combination of lopinavir-ritonavir and hydroxychloroquine had a 96.8% probability of harm.

Corticosteroids didn’t improve long-term outcomes, although enrollment in the treatment domain was terminated early in response to external evidence. The probability of improving 6-month survival ranged from 57.1% to 61.6% for various hydrocortisone dosing strategies.

Consistent treatment effects

When considered along with previously reported short-term results from the REMAP-CAP trial, the findings indicate that initial in-hospital treatment effects were consistent for most therapies through 6 months.

“We were very relieved to see that treatments with a favorable benefit for patients in the short term also appeared to be beneficial through 180 days,” said Dr. Lawler. “This supports the current clinical practice strategy in providing treatment to critically ill patients with COVID-19.”

In a subgroup analysis of 989 patients, health-related quality of life at day 180 was higher among those treated with IL-6 receptor antagonists and antiplatelet agents. The average quality-of-life score for the lopinavir-ritonavir group was lower than for control patients.

Among 720 survivors, 273 patients (37.9%) had moderate, severe, or complete disability at day 180. IL-6 receptor antagonists had a 92.6% probability of reducing disability, and anakinra (an IL-1 receptor antagonist) had a 90.8% probability of reducing disability. However, lopinavir-ritonavir had a 91.7% probability of worsening disability.

The REMAP-CAP trial investigators will continue to assess treatment domains and long-term outcomes among COVID-19 patients. They will evaluate additional data regarding disability, quality of life, and long-COVID outcomes.
 

“Reassuring” results

Commenting on the study, Angela Cheung, MD, PhD, a professor of medicine at the University of Toronto and senior scientist at the Toronto General Research Institute, said, “It is important to look at the longer-term effects of these therapies, as sometimes we may improve things in the short term, but that may not translate to longer-term gains. Historically, most trials conducted in this patient population assess only short outcomes, such as organ failure or 28-day mortality.”

Dr. Cheung, who wasn’t involved with this study, serves as the co-lead for the Canadian COVID-19 Prospective Cohort Study (CANCOV) and the Recovering From COVID-19 Lingering Symptoms Adaptive Integrative Medicine Trial (RECLAIM). These studies are also analyzing long-term outcomes among COVID-19 patients.

“It is reassuring to see that the 6-month outcomes are consistent with the short-term outcomes,” she said. “This study will help guide critical care medicine physicians in their treatment of critically ill patients with COVID-19.”

The study was supported by numerous grants and funds, including the Canadian Institutes of Health Research COVID-19 Rapid Research Funding. Amgen and Eisai also provided funding. Dr. Lawler received grants from the Canadian Institutes of Health Research and the Heart and Stroke Foundation of Canada during the conduct of the study and personal fees from Novartis, CorEvitas, Partners Healthcare, and the American College of Cardiology outside the submitted work. Dr. Cheung has disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.


Two drug classes appear effective for gastroparesis treatment

Article Type
Changed
Wed, 01/25/2023 - 13:16

Two classes of drugs may be more effective than others for the treatment of gastroparesis, though the overall quality of evidence remains low to moderate and additional data are needed, according to a new report.

Oral dopamine antagonists and tachykinin-1 antagonists appear superior to placebo, the study found. In addition, some drugs rank higher for addressing individual symptoms.

“Gastroparesis has a substantial impact on quality of life and societal functioning for patients, and the costs to the health service are high,” Alexander Ford, MBChB, MD, a professor of gastroenterology and honorary consultant gastroenterologist at the Leeds (England) Institute of Medical Research at St. James’s, University of Leeds, said in an interview.

“There are very few licensed therapies, but some novel drugs are in the pipeline, some existing drugs that are licensed for other conditions could be repurposed if efficacious, and some older drugs that have safety concerns may be beneficial,” he said. “Given the impact on patients and their symptoms, they may be willing to accept these safety risks in return for symptom improvement.”

Only one drug, the dopamine antagonist metoclopramide, has Food and Drug Administration approval for the treatment of gastroparesis, noted Dr. Ford and colleagues. The lack of other recommended drugs or new medications has resulted in off-label use of drugs in other classes.

The study was published online in Gastroenterology.
 

Investigating treatments

To address the lack of evidence supporting the efficacy and safety of licensed and unlicensed drugs for the condition, the researchers conducted a systematic review and network meta-analysis of randomized controlled trials of drugs for gastroparesis dating from 1947 to September 2022. The trials involved more than a dozen drugs in several classes.

They determined drug efficacy on the basis of global symptoms of gastroparesis and individual symptoms such as nausea, vomiting, abdominal pain, bloating, or fullness. They judged safety on the basis of total adverse events and adverse events leading to withdrawal.

The research team extracted data as intention-to-treat analyses, assuming dropouts to be treatment failures. They reported efficacy as a pooled relative risk of symptoms not improving and ranked the drugs according to P-score.
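
To make that outcome definition concrete, here is a minimal sketch of a single trial's relative risk of symptoms not improving under the intention-to-treat convention described above; the counts are hypothetical, and the network meta-analysis pooling itself is not reproduced:

# Hypothetical single-trial counts (not taken from the meta-analysis).
active_randomized, active_improved = 60, 32
placebo_randomized, placebo_improved = 58, 20

# Intention-to-treat: everyone randomized stays in the denominator, and
# dropouts simply never count as improved, i.e., they are treatment failures.
risk_active = (active_randomized - active_improved) / active_randomized
risk_placebo = (placebo_randomized - placebo_improved) / placebo_randomized
relative_risk = risk_active / risk_placebo
print(round(relative_risk, 2))  # values below 1 favor the active drug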

The analysis included 29 randomized controlled trials with 3,772 patients. Only four trials were at low risk of bias.

Overall, only two drug classes were considered efficacious: oral dopamine antagonists (RR, 0.58; P-score, 0.96) and tachykinin-1 antagonists (RR, 0.69; P-score, 0.83).
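
The P-score used for ranking is, roughly, the average probability that a treatment beats each of its competitors (a frequentist analogue of SUCRA). A toy illustration with made-up pairwise probabilities, not the study's own estimates:

import numpy as np

treatments = ["drug A", "drug B", "placebo"]
# p_better[i, j]: assumed probability that treatment i is better than treatment j.
p_better = np.array([
    [0.5, 0.8, 0.9],
    [0.2, 0.5, 0.7],
    [0.1, 0.3, 0.5],
])

# P-score: mean probability of being better than each of the other treatments.
n = len(treatments)
p_scores = (p_better.sum(axis=1) - 0.5) / (n - 1)
for name, score in zip(treatments, p_scores):
    print(f"{name}: P-score {score:.2f}")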

On the basis of 25 trials that reported on global symptoms, clebopride ranked first for efficacy (RR, 0.30; P-score, 0.99), followed by domperidone (RR, 0.69; P-score, 0.76). None of the other drugs were superior to the placebo. After direct and indirect comparisons, clebopride was superior to all other drugs except aprepitant.

After excluding three trials with a placebo run-in and a trial where only responders to single-blind domperidone were randomized, the researchers analyzed 21 trials with 2,233 patients. In this analysis, domperidone ranked first (RR, 0.48; P-score, 0.93), followed by oral metoclopramide (RR, 0.54; P-score, 0.87). None of the other drugs were superior to placebo.

Among 16 trials, including 1,381 patients, that confirmed delayed gastric emptying among all participants, only clebopride and metoclopramide were more efficacious than placebo. Clebopride ranked first (RR, 0.30; P-score, 0.95) and metoclopramide ranked third (RR, 0.48).

Among 13 trials with 785 patients with diabetic gastroparesis, none of the active drugs were superior to placebo. Among 12 trials recruiting patients with idiopathic or mixed etiology gastroparesis, clebopride ranked first (RR, 0.30; P-score, 0.93).

On the basis of trials that assessed individual symptoms, oral metoclopramide ranked first for nausea (RR, 0.46; P-score, 0.95), fullness (RR, 0.67; P-score, 0.86), and bloating (RR, 0.53; P-score, 0.97). However, the data came from one small trial. Tradipitant and TZP-102, a ghrelin agonist, were efficacious for nausea, and TZP-102 ranked second for fullness. No drugs were more efficacious than the placebo for abdominal pain or vomiting.

Among 20 trials that reported on the total number of adverse events, camicinal was the least likely to be associated with adverse events (RR, 0.77; P-score, 0.93) and prucalopride was the most likely to be associated with adverse events (RR, 2.96; P-score, 0.10). Prucalopride, oral metoclopramide, and aprepitant also were more likely than placebo to be associated with adverse events.

In 23 trials that reported on withdrawals caused by adverse events, camicinal was the least likely to be associated with withdrawals (RR, 0.20; P-score, 0.87). Nortriptyline was the most likely to be associated with withdrawals (RR, 3.33; P-score, 0.16). However, there were no significant differences between any individual drug and placebo.

Urgent need remains

More trials of drugs to treat gastroparesis are needed, Dr. Ford said.

“We need to consider the reintroduction of dopamine antagonists, if patients are willing to accept the safety concerns,” he added. “The other important point is most drugs were not of benefit. There is an urgent need to find efficacious therapies, and these should be fast-tracked for licensing approval if efficacy is proven.”

The study is “helpful for practicing clinicians since it provides a comprehensive review of clinical trials in gastroparesis,” Anthony Lembo, MD, a gastroenterologist at the Cleveland Clinic, said in an interview.

Dr. Lembo, who wasn’t involved with this study, has researched several drugs for gastroparesis, including relamorelin and TZP-102. He agreed that additional research is needed.

“There is a paucity of novel treatments currently in development,” he said. “However, there is interest in developing a product similar to domperidone without cardiac side effects, as well as performing larger studies with botulinum toxin injection.”

The authors did not disclose a funding source for the study. One author disclosed research funding from the National Institutes of Health and consulting roles with various pharmaceutical companies. Dr. Ford and the other authors reported no disclosures. Dr. Lembo reported no relevant disclosures.

A version of this article first appeared on Medscape.com.


Earlier colorectal cancer screening appears cost-effective in overweight, obese patients

Is the juice worth the squeeze?
Article Type
Changed
Fri, 01/06/2023 - 11:37

Starting colorectal cancer screening earlier than age 50 appears to be cost-effective for both men and women across all body mass index (BMI) measures, according to a study published in Clinical Gastroenterology and Hepatology.

In particular, colonoscopy is cost-effective at age 45 for all BMI strata and at age 40 in obese men. In addition, fecal immunochemical testing (FIT) is highly cost-effective at ages 40 or 45 for all BMI values, wrote Aaron Yeoh, MD, a gastroenterologist at Stanford (Calif.) University, and colleagues.

Body fatness, as reflected by a high BMI, has increased sharply in recent decades and has been associated with a higher risk of colorectal cancer (CRC). Given the rising incidence of CRC in younger people, the American Cancer Society and U.S. Preventive Services Task Force now endorse screening at age 45. In previous analyses, Dr. Yeoh and colleagues suggested that the policy is likely to be cost-effective, but they didn’t explore the potential differences by BMI.

“Our results suggest that 45 years of age is a reasonable screening initiation age for women and men with BMI ranging from normal through all classes of obesity,” the authors wrote. “Before changing screening policy, supportive data from clinical studies would be needed. Our approach can be applied to future efforts aiming to risk-stratify CRC screening based on multiple clinical factors or biomarkers.”

The research team examined the potential effectiveness and cost-effectiveness of screening tailored to BMI starting as early as age 40 and ending at age 75 in 10 separate cohorts of men and women of normal weight (18.5 to <25 kg/m2), overweight (25 to <30 kg/m2), and three strata of obesity – obese I (30 to <35 kg/m2), obese II (35 to <40 kg/m2), and obese III (>40 kg/m2).
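
For reference, those cut points translate into a simple lookup; this helper is purely illustrative and is not part of the study's model:

def bmi_stratum(bmi_kg_m2: float) -> str:
    """Map a BMI value to the strata modeled in the study (illustrative only)."""
    if bmi_kg_m2 < 18.5:
        return "underweight (not among the modeled cohorts)"
    if bmi_kg_m2 < 25:
        return "normal weight"
    if bmi_kg_m2 < 30:
        return "overweight"
    if bmi_kg_m2 < 35:
        return "obese I"
    if bmi_kg_m2 < 40:
        return "obese II"
    return "obese III"

print(bmi_stratum(33.2))  # obese I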

For each cohort, the researchers estimated incremental costs per quality-adjusted life year (QALY) gained by initiating screening at age 40 versus age 45 versus age 50, or by shortening colonoscopy intervals. They modeled screening colonoscopy every 10 years (Colo10) or every 5 years (Colo5), or annual FIT, offered from ages 40, 45, or 50 through age 75 with 100% adherence, with postpolypectomy surveillance through age 80.
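
The quantity being estimated here is an incremental cost-effectiveness ratio (ICER): the extra cost of the more intensive strategy divided by the extra QALYs it produces. A minimal sketch with hypothetical per-person figures, not outputs from the study's model:

# Hypothetical discounted per-person cost and QALY totals for two strategies.
cost_colo10_age50, qalys_colo10_age50 = 2_150.00, 19.480
cost_colo10_age45, qalys_colo10_age45 = 2_460.00, 19.484

incremental_cost = cost_colo10_age45 - cost_colo10_age50
incremental_qalys = qalys_colo10_age45 - qalys_colo10_age50
icer = incremental_cost / incremental_qalys  # dollars per QALY gained
print(f"${icer:,.0f} per QALY gained")  # compared against thresholds such as $100,000/QALY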

For model inputs, the research team favored high-quality data from meta-analyses or large prospective trials. Screening, treatment, and complication costs were set at 2018 Centers for Medicare & Medicaid Services rates for ages 65 and older and modified to reflect commercial costs for ages younger than 65. The authors assumed use of moderate sedation, and sensitivity analyses addressed possible increased costs and complications of colonoscopy under propofol.

Overall, without screening, sex-specific total CRC deaths were similar for people with overweight or obesity I-III and slightly higher than for people with normal BMI. For both men and women across all BMI strata, Colo10 or FIT starting at age 50 substantially decreased CRC incidence and mortality versus no screening, and the magnitude of the clinical impact was comparable across BMI.

For both sexes across BMI, Colo10 or FIT starting at age 50 was highly cost-effective. The cost per QALY gained for Colo10 compared with no screening became more favorable as BMI increased from normal to obesity III. FIT was cost-saving compared with no screening for all cohorts and was cost-saving or highly cost-effective compared with Colo10 within each cohort.

Initiating Colo10 at age 45 showed incremental decreases in CRC incidence and mortality, which were modest compared with the gains of Colo10 at age 50 versus no screening. However, the incremental gains were achieved at acceptable incremental costs ranging from $64,500 to $85,900 per QALY gained in women and from $33,400 to $64,200 per QALY gained in men.

Initiating Colo10 at age 40 in women and men in the lowest three BMI strata was associated with high incremental costs per QALY gained. In contrast, Colo10 initiation at age 40 cost $80,400 per QALY gained in men with obesity III and $93,300 per QALY gained in men with obesity II.

FIT starting at ages 40 or 45 yielded progressively greater decreases in CRC incidence and mortality for both men and women across BMI strata, and it was highly cost-effective versus starting at later ages. At every screening initiation age, FIT was either cost-saving compared with Colo10 or preferred because the incremental cost per QALY of Colo10 over FIT was very high, and FIT required substantially fewer colonoscopies per person.

Intensifying screening by shortening the colonoscopy interval to Colo5 was never preferred over shifting Colo10 to earlier screening initiation ages. In all cohorts, Colo5 was either less effective and more costly than Colo10 initiated at a younger age, or, when it was more effective, the cost per QALY gained was substantially higher than $100,000.

Additional studies are needed to understand obesity-specific colonoscopy risks and costs, the authors wrote. In addition, obesity is only one of several factors that should be considered when tailoring CRC screening to the level of CRC risk, they wrote.

“As the search for a multifactor prediction tool that is ready for clinical application continues, we face the question of how to approach single CRC risk factors such as obesity,” they wrote. “While screening guidelines based on BMI can be envisioned if supportive clinical data accumulate, clinical implementation must overcome operational challenges.”

The study funding was not disclosed. One author reported advisory and consultant roles for several medical companies, and the remaining authors disclosed no conflicts.

Is the juice worth the squeeze?

Obesity is associated with an increased risk of colorectal cancer, along with cancers of the breast, endometrium, and esophagus. Even maternal obesity is associated with higher offspring colorectal cancer rates. Key mechanisms that underlie these associations include high insulin levels in obesity that propel tumor growth, adipose tissue that secretes inflammatory cytokines, and high glucose levels that act as fuel for cancer proliferation.

Dr. Sarah McGill
With the recommended start of colorectal cancer screening now at age 45, and the U.S. demographic obesity problem worsening, Yeoh and his Stanford colleagues put their well-described cost-effectiveness model to work to analyze screening at different body mass indices. The new inputs consider not only higher colorectal cancer risk among obese individuals, but also increased all-cause mortality.

For men with BMI over 35, moving the colonoscopy screening age earlier to age 40 was cost-effective. However, it’s not clear that in practice the juice is worth the squeeze. Changing screening initiation times further based on personalized factors such as BMI could make screening more confusing for patients and physicians and may hurt uptake, a critical factor for the success of any screening program.

The study supports the current paradigm that screening starting at age 45 is cost-effective among men and women at all BMI ranges, a reassuring conclusion. It also serves as a sobering reminder that promoting metabolic health in our patients, our schools, and our communities is a valuable endeavor.
 

Sarah McGill, MD, MSc, FACG, FASGE, is associate professor of medicine, gastroenterology, and hepatology at the University of North Carolina at Chapel Hill. She receives research funding from Olympus America, Finch Therapeutics, Genentech, Guardant Health, and Exact Sciences.

Publications
Topics
Sections
Body

Obesity is associated with an increased risk of colorectal cancer, along with cancers of the breast, endometrium, and esophagus. Even maternal obesity is associated with higher offspring colorectal cancer rates. Key mechanisms that underlie these associations include high insulin levels in obesity that propel tumor growth, adipose tissue that secretes inflammatory cytokines, and high glucose levels that act as fuel for cancer proliferation.

Dr. Sarah McGill
With the recommended start of colorectal cancer screening now at age 45, and the U.S. demographic obesity problem worsening, Yeoh and his Stanford colleagues put their well-described cost-effectiveness model to work to analyze screening at different body mass indices. The new inputs consider not only higher colorectal cancer risk among obese individuals, but also increased all-cause mortality.

For men with BMI over 35, moving the colonoscopy screening age earlier to age 40 was cost-effective. However, it’s not clear that in practice the juice is worth the squeeze. Changing screening initiation times further based on personalized factors such as BMI could make screening more confusing for patients and physicians and may hurt uptake, a critical factor for the success of any screening program.

The study supports the current paradigm that screening starting at age 45 is cost-effective among men and women at all BMI ranges, a reassuring conclusion. It also serves as a sobering reminder that promoting metabolic health in our patients, our schools, and our communities is a valuable endeavor.
 

Sarah McGill, MD, MSc, FACG, FASGE, is associate professor medicine, gastroenterology, and hepatology at the University of North Carolina at Chapel Hill. She receives research funding from Olympus America, Finch Therapeutics, Genentech, Guardant Health, and Exact Sciences.

Body

Obesity is associated with an increased risk of colorectal cancer, along with cancers of the breast, endometrium, and esophagus. Even maternal obesity is associated with higher offspring colorectal cancer rates. Key mechanisms that underlie these associations include high insulin levels in obesity that propel tumor growth, adipose tissue that secretes inflammatory cytokines, and high glucose levels that act as fuel for cancer proliferation.

Dr. Sarah McGill
With the recommended start of colorectal cancer screening now at age 45, and the U.S. demographic obesity problem worsening, Yeoh and his Stanford colleagues put their well-described cost-effectiveness model to work to analyze screening at different body mass indices. The new inputs consider not only higher colorectal cancer risk among obese individuals, but also increased all-cause mortality.

For men with BMI over 35, moving the colonoscopy screening age earlier to age 40 was cost-effective. However, it’s not clear that in practice the juice is worth the squeeze. Changing screening initiation times further based on personalized factors such as BMI could make screening more confusing for patients and physicians and may hurt uptake, a critical factor for the success of any screening program.

The study supports the current paradigm that screening starting at age 45 is cost-effective among men and women at all BMI ranges, a reassuring conclusion. It also serves as a sobering reminder that promoting metabolic health in our patients, our schools, and our communities is a valuable endeavor.
 

Sarah McGill, MD, MSc, FACG, FASGE, is associate professor medicine, gastroenterology, and hepatology at the University of North Carolina at Chapel Hill. She receives research funding from Olympus America, Finch Therapeutics, Genentech, Guardant Health, and Exact Sciences.

Title
Is the juice worth the squeeze?
Is the juice worth the squeeze?

Starting colorectal cancer screening earlier than age 50 appears to be cost-effective for both men and women across all body mass index (BMI) measures, according to a study published in Clinical Gastroenterology and Hepatology.

In particular, colonoscopy is cost-effective at age 45 for all BMI strata and at age 40 in obese men. In addition, fecal immunochemical testing (FIT) is highly cost-effective at ages 40 or 45 for all BMI values, wrote Aaron Yeoh, MD, a gastroenterologist at the Stanford (Calif.) University, and colleagues.

Increased body fatness, defined as a high BMI, has increased sharply in recent decades and has been associated with a higher risk of colorectal cancer (CRC). Given the rising incidence of CRC in younger people, the American Cancer Society and U.S. Preventive Services Task Force now endorse screening at age 45. In previous analyses, Dr. Yeoh and colleagues suggested that the policy is likely to be cost-effective, but they didn’t explore the potential differences by BMI.

“Our results suggest that 45 years of age is a reasonable screening initiation age for women and men with BMI ranging from normal through all classes of obesity,” the authors wrote. “Before changing screening policy, supportive data from clinical studies would be needed. Our approach can be applied to future efforts aiming to risk-stratify CRC screening based on multiple clinical factors or biomarkers.”

The research team examined the potential effectiveness and cost-effectiveness of screening tailored to BMI starting as early as age 40 and ending at age 75 in 10 separate cohorts of men and women of normal weight (18.5 to <25 kg/m2), overweight (25 to <30 kg/m2), and three strata of obesity – obese I (30 to <35 kg/m2), obese II (35 to <40 kg/m2), and obese III (>40 kg/m2).

For each cohort, the researchers estimated incremental costs per quality-adjusted life year (QALY) gained by initiating screening at age 40 versus age 45 versus age 50, or by shortening colonoscopy intervals. They modeled screening colonoscopy every 10 years (Colo10) or every 5 years (Colo5), or annual FIT, offered from ages 40, 45, or 50 through age 75 with 100% adherence, with postpolypectomy surveillance through age 80.

For model inputs, the research team favored high-quality data from meta-analyses or large prospective trials. Screening, treatment, and complication costs were set at 2018 Centers for Medicare & Medicaid Services rates for ages 65 and older and modified to reflect commercial costs at ages 65 and younger. The authors assumed use of moderate sedation, and sensitivity analyses addressed possible increased costs and complications of colonoscopy under propofol.

Overall, without screening, sex-specific total CRC deaths were similar for people with overweight or obesity I-III and slightly higher than for people with normal BMI. For both men and women across all BMI strata, Colo10 or FIT starting at age 50 substantially decreased CRC incidence and mortality versus no screening, and the magnitude of the clinical impact was comparable across BMI.

For both sexes across BMI, Colo10 or FIT starting at age 50 was highly cost-effective. The cost per QALY gained for Colo10 compared with no screening became more favorable as BMI increased from normal to obesity III. FIT was cost-saving compared with no screening for all cohorts and was cost-saving or highly cost-effective compared with Colo10 within each cohort.

Initiating Colo10 at age 45 showed incremental decreases in CRC incidence and mortality, which were modest compared with the gains of Colo10 at age 50 versus no screening. However, the incremental gains were achieved at acceptable incremental costs ranging from $64,500 to $85,900 per QALY gained in women and from $33,400 to $64,200 per QALY gained in men.

Initiating Colo10 at age 40 in women and men in the lowest three BMI strata was associated with high incremental costs per QALY gained. In contrast, Colo10 initiation at age 40 cost $80,400 per QALY gained in men with obesity III and $93,300 per QALY gained in men with obesity II.

FIT starting at ages 40 or 45 yielded progressively greater decreases in CRC incidence and mortality for both men and women across BMI strata, and it was highly cost-effective versus starting at later ages. Compared with Colo10, at every screening initiation age, FIT was cost-saving or preferred based on very high incremental costs per QALY, and FIT required substantially fewer colonoscopies per person.

Intensifying screening by shortening the colonoscopy interval to Colo5 was never preferred over shifting Colo10 to earlier screening initiation ages. In all cohorts, Colo5 was either less effective and more costly than Colo10 at a younger age, or when it was more effective, the cost per QALY gained was substantially higher than $100,000 per QALY gained.

Additional studies are needed to understand obesity-specific colonoscopy risks and costs, the authors wrote. In addition, obesity is only one of several factors that should be considered when tailoring CRC screening to the level of CRC risk, they wrote.

“As the search for a multifactor prediction tool that is ready for clinical application continues, we face the question of how to approach single CRC risk factors such as obesity,” they wrote. “While screening guidelines based on BMI can be envisioned if supportive clinical data accumulate, clinical implementation must overcome operational challenges.”

The study funding was not disclosed. One author reported advisory and consultant roles for several medical companies, and the remaining authors disclosed no conflicts.

Starting colorectal cancer screening earlier than age 50 appears to be cost-effective for both men and women across all body mass index (BMI) measures, according to a study published in Clinical Gastroenterology and Hepatology.

In particular, colonoscopy is cost-effective at age 45 for all BMI strata and at age 40 in obese men. In addition, fecal immunochemical testing (FIT) is highly cost-effective at ages 40 or 45 for all BMI values, wrote Aaron Yeoh, MD, a gastroenterologist at the Stanford (Calif.) University, and colleagues.

Increased body fatness, defined as a high BMI, has increased sharply in recent decades and has been associated with a higher risk of colorectal cancer (CRC). Given the rising incidence of CRC in younger people, the American Cancer Society and U.S. Preventive Services Task Force now endorse screening at age 45. In previous analyses, Dr. Yeoh and colleagues suggested that the policy is likely to be cost-effective, but they didn’t explore the potential differences by BMI.

“Our results suggest that 45 years of age is a reasonable screening initiation age for women and men with BMI ranging from normal through all classes of obesity,” the authors wrote. “Before changing screening policy, supportive data from clinical studies would be needed. Our approach can be applied to future efforts aiming to risk-stratify CRC screening based on multiple clinical factors or biomarkers.”

The research team examined the potential effectiveness and cost-effectiveness of screening tailored to BMI, starting as early as age 40 and ending at age 75, in 10 separate cohorts of men and women across five BMI strata: normal weight (18.5 to <25 kg/m2), overweight (25 to <30 kg/m2), obese I (30 to <35 kg/m2), obese II (35 to <40 kg/m2), and obese III (≥40 kg/m2).

For each cohort, the researchers estimated incremental costs per quality-adjusted life year (QALY) gained by initiating screening at age 40 versus age 45 versus age 50, or by shortening colonoscopy intervals. They modeled screening colonoscopy every 10 years (Colo10) or every 5 years (Colo5), or annual FIT, offered from ages 40, 45, or 50 through age 75 with 100% adherence, with postpolypectomy surveillance through age 80.
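
The comparisons that follow hinge on the incremental cost-effectiveness ratio: the extra dollars spent divided by the extra QALYs gained when one strategy replaces another. The brief sketch below illustrates that arithmetic with made-up inputs; the costs, QALYs, and the $100,000-per-QALY benchmark are illustrative assumptions, not figures from the study.

```python
# Minimal sketch of an incremental cost-effectiveness ratio (ICER).
# All costs and QALYs are hypothetical placeholders, not study results.

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Extra dollars per extra QALY when the new strategy replaces the old."""
    delta_cost = cost_new - cost_old
    delta_qaly = qaly_new - qaly_old
    if delta_qaly <= 0:
        # No health gain: the new strategy cannot be cost-effective here.
        return float("inf")
    return delta_cost / delta_qaly

# Hypothetical per-person lifetime outcomes for one cohort.
colo10_age50 = {"cost": 2_000.0, "qaly": 20.000}
colo10_age45 = {"cost": 2_150.0, "qaly": 20.002}

ratio = icer(colo10_age45["cost"], colo10_age45["qaly"],
             colo10_age50["cost"], colo10_age50["qaly"])
print(f"Incremental cost per QALY gained: ${ratio:,.0f}")  # about $75,000 here

WILLINGNESS_TO_PAY = 100_000  # a commonly cited benchmark, assumed for this sketch
print("Below threshold:", ratio <= WILLINGNESS_TO_PAY)
```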

For model inputs, the research team favored high-quality data from meta-analyses or large prospective trials. Screening, treatment, and complication costs were set at 2018 Centers for Medicare & Medicaid Services rates for ages 65 and older and modified to reflect commercial costs at ages younger than 65. The authors assumed use of moderate sedation, and sensitivity analyses addressed the possible increased costs and complications of colonoscopy under propofol.

Overall, without screening, sex-specific total CRC deaths were similar for people with overweight or obesity I-III and slightly higher than for people with normal BMI. For both men and women across all BMI strata, Colo10 or FIT starting at age 50 substantially decreased CRC incidence and mortality versus no screening, and the magnitude of the clinical impact was comparable across BMI.

For both sexes across BMI, Colo10 or FIT starting at age 50 was highly cost-effective. The cost per QALY gained for Colo10 compared with no screening became more favorable as BMI increased from normal to obesity III. FIT was cost-saving compared with no screening for all cohorts and was cost-saving or highly cost-effective compared with Colo10 within each cohort.

Initiating Colo10 at age 45 rather than age 50 yielded further decreases in CRC incidence and mortality, although these gains were modest compared with those of Colo10 at age 50 versus no screening. However, the incremental gains were achieved at acceptable incremental costs, ranging from $64,500 to $85,900 per QALY gained in women and from $33,400 to $64,200 per QALY gained in men.

Initiating Colo10 at age 40 was associated with high incremental costs per QALY gained in women of all BMI strata and in men in the lowest three BMI strata. In contrast, Colo10 initiation at age 40 cost $80,400 per QALY gained in men with obesity III and $93,300 per QALY gained in men with obesity II.

FIT starting at age 40 or 45 yielded progressively greater decreases in CRC incidence and mortality for both men and women across BMI strata and was highly cost-effective compared with starting at later ages. At every screening initiation age, FIT was either cost-saving compared with Colo10 or preferred because Colo10's incremental cost per QALY gained was very high, and FIT required substantially fewer colonoscopies per person.

Intensifying screening by shortening the colonoscopy interval to Colo5 was never preferred over shifting Colo10 to an earlier initiation age. In all cohorts, Colo5 was either less effective and more costly than Colo10 begun at a younger age or, when it was more effective, it cost substantially more than $100,000 per QALY gained.
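
In cost-effectiveness terms, a strategy loses either because it is dominated (costlier and no more effective than an alternative) or because its incremental cost per QALY exceeds the willingness-to-pay threshold. The sketch below encodes that decision rule with invented numbers; the strategy names, costs, QALYs, and threshold are hypothetical illustrations, not the study's values.

```python
# Sketch of the decision rule described above: a strategy loses either because
# it adds no QALYs relative to the alternative or because its incremental
# cost per QALY exceeds the willingness-to-pay threshold. Numbers are invented.

WTP = 100_000  # dollars per QALY gained; an assumed benchmark

def preferred(option_a, option_b):
    """Return the preferred strategy name under a simple threshold rule."""
    d_cost = option_b["cost"] - option_a["cost"]
    d_qaly = option_b["qaly"] - option_a["qaly"]
    if d_qaly <= 0:
        return option_a["name"]          # B adds no QALYs; keep A
    if d_cost / d_qaly <= WTP:
        return option_b["name"]          # B buys extra QALYs at acceptable cost
    return option_a["name"]              # B's incremental cost per QALY is too high

colo10_from_45 = {"name": "Colo10 from age 45", "cost": 2_150.0, "qaly": 20.002}
colo5_from_50 = {"name": "Colo5 from age 50", "cost": 2_600.0, "qaly": 20.003}

print(preferred(colo10_from_45, colo5_from_50))  # "Colo10 from age 45" wins here
```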

Additional studies are needed to understand obesity-specific colonoscopy risks and costs, the authors wrote, adding that obesity is only one of several factors that should be considered when tailoring CRC screening to the level of CRC risk.

“As the search for a multifactor prediction tool that is ready for clinical application continues, we face the question of how to approach single CRC risk factors such as obesity,” they wrote. “While screening guidelines based on BMI can be envisioned if supportive clinical data accumulate, clinical implementation must overcome operational challenges.”

The study funding was not disclosed. One author reported advisory and consultant roles for several medical companies, and the remaining authors disclosed no conflicts.

Proximal ADR could become important new quality metric

'What gets measured gets managed'
Article Type
Changed
Wed, 04/12/2023 - 13:41

Measurement of the proximal adenoma detection rate may be an important new quality metric for screening colonoscopy, researchers propose in a study that found proportionately more adenomas detected in the right colon with increasing patient age.

As patients age, in fact, the rate of increase of proximal adenomas is far greater than for distal adenomas in both men and women and in all races, wrote Lawrence Kosinski, MD, founder and chief medical officer of Sonar MD in Chicago, and colleagues.

Dr. Lawrence Kosinski

Adenoma detection rate (ADR), the proportion of screening colonoscopies performed by a physician that detect at least one histologically confirmed colorectal adenoma or adenocarcinoma, has become an accepted quality metric because of the association of high ADR with lower rates of postcolonoscopy colorectal cancer (CRC). ADR varies widely among endoscopists, however, which could be related to differences in adenoma detection in different parts of the colon.

“An endoscopist could perform a high-quality examination of the distal colon and find one adenoma but at the same time miss important pathology in the proximal colon,” the authors wrote. “These differences could be clinically important if CRC occurs after colonoscopy.” The study was published in Techniques and Innovations in Gastrointestinal Endoscopy.

Dr. Kosinski and colleagues analyzed retrospective claims data from all colonoscopies performed from 2016 to 2018 and submitted to the Health Care Service Corporation, the exclusive Blue Cross Blue Shield licensee for Illinois, Texas, Oklahoma, New Mexico, and Montana. All 50 states were represented in the patient population, though Illinois and Texas accounted for 66% of the cases.

The research team limited the study group to patients who underwent a screening colonoscopy, representing 30.9% of the total population. They further refined the data to include only screening colonoscopies performed by the 710 endoscopists with at least 100 screenings during the study period, representing 34.5% of the total patients. They also excluded 10,685 cases with a family history of CRC because these high-risk patients could alter the results.

Using ICD-10 codes, the researchers identified the polyp detection locations and then calculated the ADR for the entire colon (T-ADR) and both the proximal (P-ADR) and distal (D-ADR) colon to determine differences in the ratio of P-ADR versus D-ADR by age, sex, and race. They were unable to determine whether the polyps were adenomas or sessile serrated lesions, so the ADR calculations include both.
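
As an illustration of how segment-specific detection rates can be computed once polyp locations are coded, here is a minimal sketch; the record structure, field names, and example data are hypothetical and are not the study's actual claims schema.

```python
# Minimal sketch of the detection-rate calculations described above, using
# made-up records; field names are illustrative, not the study's actual schema.
from dataclasses import dataclass

@dataclass
class Screening:
    patient_id: str
    proximal_polyp: bool   # polyp coded in the proximal (right) colon
    distal_polyp: bool     # polyp coded in the distal (left) colon

def detection_rates(exams):
    n = len(exams)
    t_adr = sum(e.proximal_polyp or e.distal_polyp for e in exams) / n
    p_adr = sum(e.proximal_polyp for e in exams) / n
    d_adr = sum(e.distal_polyp for e in exams) / n
    return t_adr, p_adr, d_adr

exams = [
    Screening("a", True, False),
    Screening("b", False, True),
    Screening("c", False, False),
    Screening("d", True, True),
]
t, p, d = detection_rates(exams)
print(f"T-ADR={t:.2f}  P-ADR={p:.2f}  D-ADR={d:.2f}  P/D ratio={p/d:.2f}")
```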

The 182,296 screening colonoscopies included 93,164 women (51%) and 89,132 men (49%). About 79% of patients were aged 50-64 years, and 5.8% were under age 50. The dataset preceded the U.S. Preventive Services Task Force recommendation to initiate screening at age 45.

Overall, T-ADR was consistent with accepted norms in both men (25.99%) and women (19.72%). Compared with women, men had a 4.5% higher prevalence of proximal adenomas and a 2.5% higher prevalence of distal adenomas at any age. The small cohort of Native Americans (296 patients) had a numerically higher T-ADR, P-ADR, and D-ADR than other groups.

T-ADR increased significantly with advancing age, from 0.13 in patients under age 40 to 0.39 in those aged 70 and older. The increase was driven by a sharp rise in P-ADR, particularly after age 60, whereas D-ADR rose relatively little after ages 45-49.

Notably, the P-ADR/D-ADR ratio increased from 1.2 in patients under age 40 to 2.65 in ages 75 and older in both men and women.

Because endoscopist experience affects ADR, the research team also examined detection rates by decile of each endoscopist's total colonoscopy volume. T-ADR, P-ADR, and D-ADR each showed a direct, linear relationship with total colonoscopy volume, and the slope of the P-ADR trendline was 2.3 times that of the D-ADR trendline, indicating that higher procedure volume was directly related to higher polyp detection, particularly proximal detection.
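
The slope comparison can be reproduced conceptually with an ordinary least-squares fit of detection rate against volume decile, as in the sketch below; the decile values are invented for illustration and do not reproduce the study's figures.

```python
# Sketch of comparing trendline slopes across volume deciles with a simple
# least-squares fit; the decile ADR values below are invented for illustration.
import numpy as np

deciles = np.arange(1, 11)                      # 1 = lowest-volume endoscopists
p_adr = np.array([0.10, 0.11, 0.12, 0.13, 0.15, 0.16, 0.17, 0.19, 0.20, 0.22])
d_adr = np.array([0.09, 0.09, 0.10, 0.10, 0.11, 0.11, 0.12, 0.12, 0.13, 0.13])

p_slope, _ = np.polyfit(deciles, p_adr, 1)      # change in P-ADR per decile
d_slope, _ = np.polyfit(deciles, d_adr, 1)      # change in D-ADR per decile
print(f"P-ADR slope {p_slope:.4f} vs D-ADR slope {d_slope:.4f}, "
      f"ratio {p_slope / d_slope:.1f}")
```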

“Our data demonstrate that it is feasible to measure P-ADR in clinical practice,” the authors wrote. “We propose that P-ADR be considered a quality metric for colonoscopy.”

In addition, because of considerable variation in ADR based on age and sex, calculated ADR should be normalized by the age and sex of the specific patient profile of each endoscopist so relevant benchmarks can be established based on practice demographics, they wrote. For example, an endoscopist with a practice that includes predominantly younger women would have a different benchmark than a colleague with an older male population.
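
The authors do not spell out a normalization method; one common approach is indirect standardization, which compares an endoscopist's observed detections with the number expected given reference rates for that endoscopist's particular age and sex mix. A minimal sketch with hypothetical reference rates and panel counts follows.

```python
# Sketch of normalizing an endoscopist's ADR to practice demographics
# (indirect standardization). Reference rates and panel mix are hypothetical.

# Reference ADR by (sex, age band), e.g. drawn from a large benchmark dataset.
reference_adr = {
    ("F", "45-54"): 0.15, ("F", "55-64"): 0.20, ("F", "65+"): 0.26,
    ("M", "45-54"): 0.20, ("M", "55-64"): 0.27, ("M", "65+"): 0.34,
}

# One endoscopist's screening volume and detections by stratum.
panel = {
    ("F", "45-54"): {"exams": 120, "detections": 20},
    ("M", "55-64"): {"exams": 200, "detections": 50},
    ("M", "65+"):   {"exams": 80,  "detections": 30},
}

observed = sum(s["detections"] for s in panel.values())
expected = sum(reference_adr[k] * s["exams"] for k, s in panel.items())
print(f"Observed/expected ratio: {observed / expected:.2f}")
# A ratio near 1.0 means detection in line with the benchmark for this
# endoscopist's particular age/sex mix; well below 1.0 flags room to improve.
```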

“With appropriate use of gender and age adjustments to ADR, endoscopists in need of further education and mentoring can be identified,” they wrote.

The authors declared no funding for the study. One author reported advisory roles for several medical companies, and the remaining authors disclosed no conflicts.


“What gets measured gets managed” is a common mantra in quality improvement. Adenoma detection rate (ADR) is currently one measure of a “quality” colonoscopy and a metric that is studied to determine means for improvement. ADR is an imperfect measure because it does not necessarily reflect the true risk of postcolonoscopy cancer in all parts of the colon; many postcolonoscopy cancers are found in the proximal colon. To better understand potential differences in polyps found in different segments of the colon, and to determine whether this was a metric that could be measured, Kosinski and colleagues studied a large claims database to compare ADR in the proximal versus the distal colon.

Dr. Sunanda V. Kane
Their main finding was that, with advancing age, the ADR for proximal lesions is far greater than for distal ones, especially after age 60. They also found that the experience of the colonoscopist, as measured by the number of procedures performed, made a difference in all forms of ADR. Given that proximal lesions tend to be flatter and potentially harder to see, it makes sense that experience is important. Determining an ADR for each part of the colon rather than grouping them together makes sense because the pathology of polyps can differ by location, and segment-specific ADR is likely a metric that will find its way into the definition of acceptable colonoscopy practice.


Sunanda Kane, MD, MSPH, is professor of medicine in the division of gastroenterology and hepatology at the Mayo Clinic, Rochester, Minn. Dr. Kane has no relevant conflicts of interest.


Genomic features may explain sex differences in HBV-associated HCC

Article Type
Changed
Wed, 01/04/2023 - 10:24

In findings that point to a potential treatment strategy, researchers in China have discovered how two risk factors – male hormones and aflatoxin – may drive hepatocellular carcinoma (HCC). The liver cancer genetics and biology differ between men and women and help explain why aflatoxin exposure increases the risk of HCC in hepatitis B virus (HBV)–infected patients, particularly in men.

The researchers found evidence that androgen signaling increased aflatoxin metabolism and genotoxicity, reduced DNA repair capabilities, and quelled antitumor immunity, Chungui Xu, PhD, with the State Key Lab of Molecular Oncology at the National Cancer Center at Peking Union Medical College in Beijing, and colleagues wrote. The study was published in Cellular and Molecular Gastroenterology and Hepatology.

“Androgen signaling in the context of genotoxic stress repressed DNA damage repair,” the authors wrote. “The alteration caused more nuclear DNA leakage into cytosol to activate the cGAS-STING pathway, which increased T-cell infiltration into tumor mass and improved anti–programmed cell death protein 1 [PD-1] immunotherapy in HCCs.”

In the study, the researchers conducted genomic analyses of HCC tumor samples from people with HBV who were exposed to aflatoxin in Qidong, China, an area that until recently had some of the highest liver cancer rates in the world. In subsequent experiments in cell lines and mice, the team investigated how the genetic alterations and transcription dysfunctions reflected the combined carcinogenic effects of aflatoxin and HBV.

Dr. Xu and colleagues performed whole-genome, whole-exome, and RNA sequencing on tumor and matched nonneoplastic liver tissues from 101 HBV-related HCC patients (47 men and 54 women). The patients had undergone primary hepatectomy without systemic treatment or radiation therapy and were followed for 5 years. Aflatoxin exposure was confirmed by the detection of aflatoxin M1 in urine samples collected 3-18 years before HCC diagnosis. For comparison, the research team analyzed 113 HBV-related HCC samples without aflatoxin exposure from the Cancer Genome Atlas database. They also looked at 181 Chinese HCC samples from the International Cancer Genome Consortium that had no record of aflatoxin exposure. They found no sex differences in mutation patterns for previously identified HCC driver genes, but the tumor mutation burden was higher in the Qidong set.
 

In the Qidong samples, the research team identified 71 genes with significantly different mutation frequencies by sex. Among those, 62 genes were associated more frequently with men, and 9 genes were associated with women. None of the genes have been reported previously as HCC drivers, although some have been found previously in other cancers, such as melanoma, lung cancer, and thyroid adenocarcinoma.
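
As an illustration of how per-gene mutation frequencies might be compared between sexes, the sketch below applies a two-by-two Fisher's exact test to invented counts; both the choice of test and the numbers are assumptions for illustration, since the paper's statistical details are not described here.

```python
# Sketch of testing whether a gene is mutated more often in one sex, as in a
# per-gene frequency comparison; counts below are invented for illustration.
from scipy.stats import fisher_exact

men_mutated, men_total = 14, 47
women_mutated, women_total = 3, 54

table = [[men_mutated, men_total - men_mutated],
         [women_mutated, women_total - women_mutated]]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio {odds_ratio:.1f}, p = {p_value:.4f}")
# In practice, p-values across thousands of genes would be adjusted for
# multiple testing (e.g., Benjamini-Hochberg) before calling a difference.
```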

From whole-genome sequencing of 88 samples, the research team detected HBV integration in 37 samples and identified 110 breakpoints. No difference in HBV breakpoint numbers was detected between the sexes, though there were differences in somatic mutation profiles and in HBV integration, and only men had HBV breakpoints binding to androgen receptors.

From RNA sequencing of 87 samples, the research team identified 3,070 significantly differentially expressed genes between men and women. The transcription levels of estrogen receptor 1 and 2 were similar between the sexes, but men expressed higher androgen receptor levels.

The researchers then analyzed the variation in gene expression between the male and female gene sets to understand HCC transcriptional dysfunction. The samples from men showed different biological capabilities, with several signaling pathways related to HCC development and progression that were up-regulated. The male samples also showed repression of specific antitumor immunity.

Men’s HCC tumor samples expressed higher levels of aflatoxin metabolism-related genes, such as AHR and CYP1A1, but lower levels of GSTM1.

Turning to cell lines, the researchers used HBV-positive HepG2.2.15 cells and PLC/PRF/5 cells to test the role of sex hormones in the regulation of AHR and CYP1A1 and how these interactions affected aflatoxin B1 cytotoxicity. After aflatoxin treatment, the addition of testosterone to the cultures significantly enhanced the transcription levels of AHR and CYP1A1. The aflatoxin dose needed to cause cell death was reduced by half in the presence of testosterone.

DNA damage from aflatoxin activates DNA repair mechanisms, so the research team analyzed different repair pathways. In the male tumor samples, the most down-regulated pathway was nonhomologous end joining (NHEJ). The male samples expressed significantly lower levels of NHEJ factors than did the female samples, including XRCC4, MRE11, ATM, XRCC5, and NBN.

In cell lines, the researchers tested the effects of androgen alone and with aflatoxin on the regulation of NHEJ factors. The transcriptional levels of XRCC4, LIG4, and MRE11 were reduced significantly in cells treated with both aflatoxin and testosterone, compared with those treated with aflatoxin alone. Notably, the addition of the estrogen 17beta-estradiol partially reversed the reduction of XRCC4 and MRE11 expression.

The tumor samples from men also showed different gene signatures of immune responses and inflammation from the samples from women. The genes related to interferon I signaling and response were up-regulated significantly in male samples but not in female samples. In addition, the samples from men showed repression of antigen-specific antitumor immunity. The research team detected significantly increased CD8+ T-cell infiltration in tumor tissues of men but not women, as well as higher transcriptional levels of PD-1 and CTLA-4, two immune checkpoint proteins on T cells that keep them from attacking the tumor. The data indicate that androgen signaling in established HBV-related HCCs contributes to the development of an immunosuppressive microenvironment, the authors wrote, which could render the tumor sensitive to anti–PD-1 immunotherapy.

In mice, the researchers examined how favoring androgen signaling affected anti–PD-1 treatment against hepatoma. They administered tamoxifen to block estrogen receptor signaling in syngeneic tumor-bearing mice; in both male and female mice, tamoxifen enhanced the anti–PD-1 effects, quickly eradicating the tumor. They also administered flutamide to tumor-bearing mice to block the androgen pathway and found no significant difference in tumor growth in female mice; in male mice, however, tumors grew faster with flutamide treatment.

“Therapeutics that favor androgen signaling and/or blocking estrogen signaling may provide a new strategy to improve the efficacy of immune checkpoint inhibitors against HCC in combination with radiotherapy or chemotherapy that induced DNA damage,” the authors wrote. “The adjuvant effects of tamoxifen for favorable androgen signaling to boost the anti–PD-1 effect in HCC patients needs future study in a prospective HCC cohort.”

The study was supported by the National Natural Science Foundation Fund of China, Innovation Fund for Medical Sciences of Chinese Academy of Medical Sciences, State Key Project for Infectious Diseases, and Peking Union Medical College. The authors disclosed no conflicts.

To read an editorial that accompanied this study in Cellular and Molecular Gastroenterology and Hepatology, go to https://www.cmghjournal.org/article/S2352-345X(22)00234-X/fulltext.
