Composite Scale Better Gauges Mucosal Injury in Celiac Disease
A new composite scale better gauges mucosal injury in celiac disease (CeD), according to a study in Clinical Gastroenterology and Hepatology.
The new morphometric duodenal biopsy mucosal scale, called VCIEL, combines the villous height-to-crypt depth ratio (Vh:Cd) and intraepithelial lymphocyte (IEL) count — each a key histological measure of small-intestinal injury in CeD.
The authors believe the VCIEL will enable a broader and more accurate measurement of mucosal health in CeD. It will be particularly useful for population analysis in clinical trials and could improve the powering of trial design. “Use of VCIEL may lead to better outcome measures for potential new therapeutic treatments benefiting patients,” wrote Jocelyn A. Silvester, MD, PhD, a pediatrician at Boston Children’s Hospital and an assistant professor at Harvard Medical School, and colleagues.
This chronic enteropathy affects about 1% of the world’s population and requires a lifelong adherence to a gluten-free diet, the authors noted.
The authors pointed to weaknesses in the current quantitative and qualitative ways of measuring gluten-induced mucosal injury on biopsy for CeD. “Morphometry measures the injury continuum for architecture and inflammation, but these are used as separate outcomes,” they wrote. “The original Marsh-Oberhuber [M-O] classifications are rather contrived approaches to assess a biologic continuum, forcing the injury in categorical groups of unclear clinical relevance and where clinically significant changes may occur within one single category.”
Moreover, the quantitation of inflammation relies on binary assessment as normal or increased, which results in histology that is unscorable by M-O if villous atrophy persists without increased IELs, they added.
The Study
In the absence of a broadly accepted single measure of mucosal injury in CeD, the group assessed whether the composite metric could improve statistical precision for assessing histology.
Enter VCIEL, which combines the Vh:Cd and IEL for individual patients with equal weighting by converting each scale to a fraction of its standard deviation and summing the results.
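That equal-weighting step can be pictured with a minimal sketch, assuming the normalization is a simple division of each measure by its cohort standard deviation; the variable names, sample values, and the sign convention (subtracting the IEL term, since IEL counts rise with injury while Vh:Cd falls) are illustrative assumptions, not taken from the paper:

```python
import statistics

def vciel(vhcd_values, iel_values):
    """Equal-weighted composite: express each measure as a fraction of
    its cohort standard deviation, then combine per patient.
    Sign convention is an assumption: the IEL term is subtracted so that
    higher VCIEL corresponds to healthier mucosa."""
    sd_vhcd = statistics.stdev(vhcd_values)
    sd_iel = statistics.stdev(iel_values)
    return [v / sd_vhcd - i / sd_iel
            for v, i in zip(vhcd_values, iel_values)]

# Illustrative values only (not trial data)
vhcd = [3.1, 2.4, 1.0, 0.5]     # villous height-to-crypt depth ratio
iel = [18.0, 25.0, 60.0, 85.0]  # IELs per 100 enterocytes
scores = vciel(vhcd, iel)       # healthier mucosa scores higher
```

Because each component is rescaled by its own spread before summing, neither measure dominates the composite simply by being on a larger numeric scale.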
The researchers applied the VCIEL formula in a reanalysis of four clinical gluten-challenge trials and compared the results for Vh:Cd and IEL separately with those for VCIEL for clinical significance (effect size) and statistical significance.
In a reanalysis of the ALV003-1021 trial, for example, the researchers observed effect sizes and P values (analysis of covariance) for the delta (difference) values of 1.37 and .038 for Vh:Cd, 1.17 and .005 for IEL, and 1.86 and .004 for VCIEL.
For the similar gluten-challenge IMGX003-NCCIH-1721 trial, the corresponding delta results were .76 and .057 for Vh:Cd, .98 and .018 for IEL, and 1.14 and .007 for VCIEL. Comparable improvements with VCIEL over individual Vh:Cd and IEL were observed for other studies, including a nontherapeutic gluten challenge study.
In NCT03409796 trial data, the computation of VCIEL values showed an improved statistical significance relative to the component values of Vh:Cd and IEL by the within-group paired 2-tailed t test P values from baseline to day 15, particularly at a 10-g gluten challenge dose: Vh:Cd, IEL, VCIEL = .0050, .0031, and .0014, respectively.
Little correlation emerged between baseline values and changes with intervention for Vh:Cd and IEL on an individual patient basis.
The greater accuracy and statistical precision of the VCIEL scale are presumably due to averaging over some of the measurement uncertainty in individual patient and timepoint Vh:Cd and IEL values and creating a composite of different histologic properties, the authors noted.
This study was funded by ImmunogenX, Inc. First author Jack A. Syage is a cofounder and shareholder in ImmunogenX Inc. Dr. Silvester has served on an advisory board for Takeda Pharmaceuticals and has received research funding from Biomedal S.L., Cour Pharmaceuticals, and Glutenostics LLC. Several coauthors disclosed various financial ties to multiple private-sector pharmaceutical and biomedical companies, including ImmunogenX.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
High-Quality Diet in Early Life May Ward Off Later IBD
A high-quality diet in early life may ward off later inflammatory bowel disease (IBD), prospective pooled data from two Scandinavian birth cohorts suggested.
It appears important to feed children a quality diet at a very young age, in particular one rich in vegetables and fish, since by age three, only dietary fish intake had any impact on IBD risk.
Although high intakes of these two food categories in very early life correlated with lower IBD risk, exposure to sugar-sweetened beverages (SSBs) was associated with an increased risk. “While non-causal explanations for our results cannot be ruled out, these novel findings are consistent with the hypothesis that early-life diet, possibly mediated through changes in the gut microbiome, may affect the risk of developing IBD,” wrote lead author Annie Guo, a PhD candidate in the Department of Pediatrics, University of Gothenburg, Sweden, and colleagues. The report was published in Gut.
“This is a population-based study investigating the risk for IBD, rather than the specific effect of diet,” Ms. Guo said in an interview. “Therefore, the results are not enough on their own to be translated into individual advice that can be applicable in the clinic. However, the study supports current dietary guidelines for small children, that is, the intake of sugar should be limited and a higher intake of fish and vegetables is beneficial for overall health.”
Two-Cohort Study
The investigators prospectively recorded food-group information on children (just under half were female) from the All Babies in Southeast Sweden and The Norwegian Mother, Father and Child Cohort Study to assess the diet quality using a Healthy Eating Index and intake frequency. Parents answered questions about their offspring’s diet at ages 12-18 months and 30-36 months. Quality of diet was measured by intake of meat, fish, fruit, vegetables, dairy, sweets, snacks, and drinks.
The Swedish cohort included 21,700 children born between October 1997 and October 1999, while the Norwegian analysis included 114,500 children, 95,200 mothers, and 75,200 fathers recruited from across Norway from 1999 to 2008. In 1,304,433 person-years of follow-up, the researchers tracked 81,280 participants from birth to childhood and adolescence, with median follow-ups in the two cohorts ranging from 1 year of age to 21.3 years (Sweden) and to 15.2 years of age (Norway). Of these children, 307 were diagnosed with IBD: Crohn’s disease (CD; n = 131); ulcerative colitis (UC; n = 97); and IBD unclassified (n = 79).
Adjusting for parental IBD history, sex, origin, education, and maternal comorbidities, the study found:
- Compared with low-quality diet, both medium- and high-quality diets at 1 year were associated with a roughly 25% reduced risk for IBD (pooled adjusted hazard ratio [aHR], 0.75 [95% CI, 0.58-0.98] and 0.75 [0.56-1.0], respectively).
- The pooled aHR per one-category increase in diet quality was 0.86 (95% CI, 0.74-0.99). The pooled aHR for IBD in 1-year-olds with high vs low fish intake was 0.70 (95% CI, 0.49-1.0), and this diet showed an association with a reduced risk for UC (pooled aHR, 0.46; 95% CI, 0.21-0.99). Higher vegetable intake at 1 year was also associated with a risk reduction in IBD (HR, 0.72; 95% CI, 0.55-0.95). It has been hypothesized that intake of vegetables and vegetable fibers may have programming effects on the immune system.
- With 72% of children reportedly consuming SSBs at age 1, pooled aHRs showed that some vs no intake of SSBs was associated with an increased risk for later IBD (pooled aHR, 1.42; 95% CI, 1.05-1.90).
- There were no obvious associations between overall IBD or CD/UC risk and meat, dairy, fruit, grains, potatoes, and foods high in sugar and/or fat. Diet at age 3 years was not associated with incident IBD (pooled aHR, 1.02; 95% CI, 0.76-1.37), suggesting that the risk impact of diet is greatest on very young and vulnerable microbiomes.
Ms. Guo noted that a Swedish national survey among 4-year-olds found a mean SSB consumption of 187 g/d with a mean frequency of once daily. The most desired changes in food habits are a lower intake of soft drinks, sweets, crisps, cakes, and biscuits and an increase in the intake of fruits and vegetables. A similar Norwegian survey among 2-year-olds showed that SSBs were consumed by 36% of all children with a mean intake of 40 g/d.
The exact mechanism by which sugar affects the intestinal microbiota is not established. “However, what we do know is that an excessive intake of sugar can disrupt the balance of the gut microbiome,” Ms. Guo said. “And if the child has a high intake of foods high in sugar, that also increases the chances that the child’s overall diet has a lower intake of other foods that contribute to a diverse microbiome such as fruits and vegetables.”
An ‘Elegant’ Study
In an accompanying editorial, gastroenterologist Ashwin N. Ananthakrishnan, MBBS, MPH, AGAF, of Mass General Brigham and the Mass General Research Institute, Boston, cautioned that accurately measuring food intake in very young children is difficult, and dietary questionnaires in this study did not address food additives and emulsifiers common in commercial baby food, which may play a role in the pathogenesis of IBD.
Another study limitation is that the dietary questionnaire used has not been qualitatively or quantitatively validated against other more conventional methods, said Dr. Ananthakrishnan, who was not involved in the research.
Nevertheless, he called the study “elegant” and expanding of the data on the importance of this period in IBD development. “Although in the present study there was no association between diet at 3 years and development of IBD (in contrast to the association observed for dietary intake at 1 year), other prospective cohorts of adult-onset IBD have demonstrated an inverse association between vegetable or fish intake and reduced risk for CD while sugar-sweetened beverages have been linked to a higher risk for IBD.”
As to the question of recommending early preventive diet for IBD, “thus far, data on the impact of diet very early in childhood, outside of breastfeeding, on the risk for IBD has been lacking,” Dr. Ananthakrishnan said in an interview. “This important study highlights that diet as early as 1 year can modify subsequent risk for IBD. This raises the intriguing possibility of whether early changes in diet could be used, particularly in those at higher risk, to reduce or even prevent future development of IBD. Of course, more works needs to be done to define modifiability of diet as a risk factor, but this is an important supportive data.”
In his editorial, Dr. Ananthakrishnan stated that despite the absence of gold-standard interventional data demonstrating a benefit of dietary interventions, “in my opinion, it may still be reasonable to suggest such interventions to motivate individuals who incorporate several of the dietary patterns associated with lower risk for IBD from this and other studies. This includes ensuring adequate dietary fiber, particularly from fruits and vegetables, intake of fish, minimizing sugar-sweetened beverages and preferring fresh over processed and ultra-processed foods and snacks.” According to the study authors, their novel findings support further research on the role of childhood diet in the prevention of IBD.
The All Babies in Southeast Sweden Study is supported by Barndiabetesfonden (Swedish Child Diabetes Foundation), the Swedish Council for Working Life and Social Research, the Swedish Research Council, the Medical Research Council of Southeast Sweden, the JDRF Wallenberg Foundation, ALF and LFoU grants from Region Östergötland and Linköping University, and the Joanna Cocozza Foundation.
The Norwegian Mother, Father and Child Cohort Study is supported by the Norwegian Ministry of Health and Care Services and the Ministry of Education and Research.
Ms. Guo received grants from the Swedish Society for Medical Research and the Henning and Johan Throne-Holst Foundation to conduct this study. Co-author Karl Mårild has received funding from the Swedish Society for Medical Research, the Swedish Research Council, and ALF, Sweden’s medical research and education co-ordinating body. The authors declared no competing interests. Dr. Ananthakrishnan is supported by the National Institutes of Health, the Leona M. and Harry B. Helmsley Charitable Trust, and the Chleck Family Foundation. He has served on the scientific advisory board for Geneoscopy.
, prospective pooled data from two Scandinavian birth cohorts suggested.
It appears important to feed children a quality diet at a very young age, in particular one rich in vegetables and fish, since by age three, only dietary fish intake had any impact on IBD risk.
Although high intakes of these two food categories in very early life correlated with lower IBD risk, exposure to sugar-sweetened beverages (SSBs) was associated with an increased risk. “While non-causal explanations for our results cannot be ruled out, these novel findings are consistent with the hypothesis that early-life diet, possibly mediated through changes in the gut microbiome, may affect the risk of developing IBD,” wrote lead author Annie Guo, a PhD candidate in the Department of Pediatrics, University of Gothenburg, Sweden, and colleagues. The report was published in Gut.
“This is a population-based study investigating the risk for IBD, rather than the specific effect of diet,” Ms. Guo said in an interview. “Therefore, the results are not enough on their own to be translated into individual advice that can be applicable in the clinic. However, the study supports current dietary guidelines for small children, that is, the intake of sugar should be limited and a higher intake of fish and vegetables is beneficial for overall health.”
Two-Cohort Study
The investigators prospectively recorded food-group information on children (just under half were female) from the All Babies in Southeast Sweden and The Norwegian Mother, Father and Child Cohort Study to assess the diet quality using a Healthy Eating Index and intake frequency. Parents answered questions about their offspring’s diet at ages 12-18 months and 30-36 months. Quality of diet was measured by intake of meat, fish, fruit, vegetables, dairy, sweets, snacks, and drinks.
The Swedish cohort included 21,700 children born between October 1997 and October 1999, while the Norwegian analysis included 114,500 children, 95,200 mothers, and 75,200 fathers recruited from across Norway from 1999 to 2008. In 1,304,433 person-years of follow-up, the researchers tracked 81,280 participants from birth to childhood and adolescence, with median follow-ups in the two cohorts ranging from 1 year of age to 21.3 years (Sweden) and to 15.2 years of age (Norway). Of these children, 307 were diagnosed with IBD: Crohn’s disease (CD; n = 131); ulcerative colitis (UC; n = 97); and IBD unclassified (n = 79).
Adjusting for parental IBD history, sex, origin, education, and maternal comorbidities, the study found:
- Compared with low-quality diet, both medium- and high-quality diets at 1 year were associated with a roughly 25% reduced risk for IBD (pooled adjusted hazard ratio [aHR], 0.75 [95% CI, 0.58-0.98] and 0.75 [0.56-1.0], respectively).
- The pooled aHR per increase of category was 0.86 (95% CI, 0.74-0.99). The pooled aHR for IBD in 1-year-olds with high vs low fish intake was 0.70 (95% CI, 0.49-1.0), and this diet showed an association with a reduced risk for UC (pooled aHR, 0.46; 95% CI, 0.21-0.99). Higher vegetable intake at 1 year was also associated with a risk reduction in IBD (HR, 0.72; 95% CI, 0.55-0.95). It has been hypothesized that intake of vegetables and vegetable fibers may have programming effects on the immune system.
- AutoWith 72% of children reportedly consuming SSBs at age 1, pooled aHRs showed that some vs no intake of SSBs was associated with an increased risk for later IBD (pooled aHR, 1.42; 95% CI, 1.05-1.90).
- There were no obvious associations between overall IBD or CD/UC risk and meat, dairy, fruit, grains, potatoes, and foods high in sugar and/or fat. Diet at age 3 years was not associated with incident IBD (pooled aHR, 1.02; 95% CI, 0.76-1.37), suggesting that the risk impact of diet is greatest on very young and vulnerable microbiomes.
Ms. Guo noted that a Swedish national survey among 4-year-olds found a mean SSB consumption of 187 g/d with a mean frequency of once daily. The most desired changes in food habits are a lower intake of soft drinks, sweets, crisps, cakes, and biscuits and an increase in the intake of fruits and vegetables. A similar Norwegian survey among 2-year-olds showed that SSBs were consumed by 36% of all children with a mean intake of 40 g/d.
The exact mechanism by which sugar affects the intestinal microbiota is not established. “However, what we do know is that an excessive intake of sugar can disrupt the balance of the gut microbiome,” Ms. Guo said. “And if the child has a high intake of foods with high in sugar, that also increases the chances that the child’s overall diet has a lower intake of other foods that contribute to a diverse microbiome such as fruits and vegetables.”
An ‘Elegant’ Study
In an accompanying editorial, gastroenterologist Ashwin N. Ananthakrishnan, MBBS, MPH, AGAF, of Mass General Brigham and the Mass General Research Institute, Boston, cautioned that accurately measuring food intake in very young children is difficult, and dietary questionnaires in this study did not address food additives and emulsifiers common in commercial baby food, which may play a role in the pathogenesis of IBD.
Another study limitation is that the dietary questionnaire used has not been qualitatively or quantitatively validated against other more conventional methods, said Dr. Ananthakrishnan, who was not involved in the research.
Nevertheless, he called the study “elegant” and expanding of the data on the importance of this period in IBD development. “Although in the present study there was no association between diet at 3 years and development of IBD (in contrast to the association observed for dietary intake at 1 year), other prospective cohorts of adult-onset IBD have demonstrated an inverse association between vegetable or fish intake and reduced risk for CD while sugar-sweetened beverages have been linked to a higher risk for IBD.”
As to the question of recommending early preventive diet for IBD, “thus far, data on the impact of diet very early in childhood, outside of breastfeeding, on the risk for IBD has been lacking,” Dr. Ananthakrishnan said in an interview. “This important study highlights that diet as early as 1 year can modify subsequent risk for IBD. This raises the intriguing possibility of whether early changes in diet could be used, particularly in those at higher risk, to reduce or even prevent future development of IBD. Of course, more works needs to be done to define modifiability of diet as a risk factor, but this is an important supportive data.”
In his editorial, Dr. Ananthakrishnan stated that despite the absence of gold-standard interventional data demonstrating a benefit of dietary interventions, “in my opinion, it may still be reasonable to suggest such interventions to motivate individuals who incorporate several of the dietary patterns associated with lower risk for IBD from this and other studies. This includes ensuring adequate dietary fiber, particularly from fruits and vegetables, intake of fish, minimizing sugar-sweetened beverages and preferring fresh over processed and ultra-processed foods and snacks.” According to the study authors, their novel findings support further research on the role of childhood diet in the prevention of IBD.
The All Babies in Southeast Sweden Study is supported by Barndiabetesfonden (Swedish Child Diabetes Foundation), the Swedish Council for Working Life and Social Research, the Swedish Research Council, the Medical Research Council of Southeast Sweden, the JDRF Wallenberg Foundation, ALF and LFoU grants from Region Östergötland and Linköping University, and the Joanna Cocozza Foundation.
The Norwegian Mother, Father and Child Cohort Study is supported by the Norwegian Ministry of Health and Care Services and the Ministry of Education and Research.
Ms. Guo received grants from the Swedish Society for Medical Research and the Henning and Johan Throne-Holst Foundation to conduct this study. Co-author Karl Mårild has received funding from the Swedish Society for Medical Research, the Swedish Research Council, and ALF, Sweden’s medical research and education co-ordinating body. The authors declared no competing interests. Dr. Ananthakrishnan is supported by the National Institutes of Health, the Leona M. and Harry B. Helmsley Charitable Trust, and the Chleck Family Foundation. He has served on the scientific advisory board for Geneoscopy.
, prospective pooled data from two Scandinavian birth cohorts suggested.
It appears important to feed children a quality diet at a very young age, in particular one rich in vegetables and fish, since by age three, only dietary fish intake had any impact on IBD risk.
Although high intakes of these two food categories in very early life correlated with lower IBD risk, exposure to sugar-sweetened beverages (SSBs) was associated with an increased risk. “While non-causal explanations for our results cannot be ruled out, these novel findings are consistent with the hypothesis that early-life diet, possibly mediated through changes in the gut microbiome, may affect the risk of developing IBD,” wrote lead author Annie Guo, a PhD candidate in the Department of Pediatrics, University of Gothenburg, Sweden, and colleagues. The report was published in Gut.
“This is a population-based study investigating the risk for IBD, rather than the specific effect of diet,” Ms. Guo said in an interview. “Therefore, the results are not enough on their own to be translated into individual advice that can be applicable in the clinic. However, the study supports current dietary guidelines for small children, that is, the intake of sugar should be limited and a higher intake of fish and vegetables is beneficial for overall health.”
Two-Cohort Study
The investigators prospectively recorded food-group information on children (just under half were female) from the All Babies in Southeast Sweden and The Norwegian Mother, Father and Child Cohort Study to assess the diet quality using a Healthy Eating Index and intake frequency. Parents answered questions about their offspring’s diet at ages 12-18 months and 30-36 months. Quality of diet was measured by intake of meat, fish, fruit, vegetables, dairy, sweets, snacks, and drinks.
The Swedish cohort included 21,700 children born between October 1997 and October 1999, while the Norwegian analysis included 114,500 children, 95,200 mothers, and 75,200 fathers recruited from across Norway from 1999 to 2008. In 1,304,433 person-years of follow-up, the researchers tracked 81,280 participants from birth to childhood and adolescence, with median follow-ups in the two cohorts ranging from 1 year of age to 21.3 years (Sweden) and to 15.2 years of age (Norway). Of these children, 307 were diagnosed with IBD: Crohn’s disease (CD; n = 131); ulcerative colitis (UC; n = 97); and IBD unclassified (n = 79).
Adjusting for parental IBD history, sex, origin, education, and maternal comorbidities, the study found:
- Compared with low-quality diet, both medium- and high-quality diets at 1 year were associated with a roughly 25% reduced risk for IBD (pooled adjusted hazard ratio [aHR], 0.75 [95% CI, 0.58-0.98] and 0.75 [0.56-1.0], respectively).
- The pooled aHR per increase of category was 0.86 (95% CI, 0.74-0.99). The pooled aHR for IBD in 1-year-olds with high vs low fish intake was 0.70 (95% CI, 0.49-1.0), and this diet showed an association with a reduced risk for UC (pooled aHR, 0.46; 95% CI, 0.21-0.99). Higher vegetable intake at 1 year was also associated with a risk reduction in IBD (HR, 0.72; 95% CI, 0.55-0.95). It has been hypothesized that intake of vegetables and vegetable fibers may have programming effects on the immune system.
- With 72% of children reportedly consuming SSBs at age 1, some vs no SSB intake was associated with an increased risk for later IBD (pooled aHR, 1.42; 95% CI, 1.05-1.90).
- There were no obvious associations between overall IBD or CD/UC risk and meat, dairy, fruit, grains, potatoes, and foods high in sugar and/or fat. Diet at age 3 years was not associated with incident IBD (pooled aHR, 1.02; 95% CI, 0.76-1.37), suggesting that the risk impact of diet is greatest on very young and vulnerable microbiomes.
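The pooled adjusted hazard ratios reported above combine the Swedish and Norwegian cohort estimates. The study summary does not spell out the pooling method, but a standard approach is fixed-effect inverse-variance meta-analysis of log hazard ratios, recovering each standard error from the reported 95% CI; the sketch below uses that approach with hypothetical input numbers, not values from the study.

```python
import math

def pooled_hr(hrs_with_cis):
    """Fixed-effect inverse-variance pooling of hazard ratios.

    Each input is (HR, lower 95% CI, upper 95% CI). The SE of log(HR)
    is recovered from the CI width: SE = (ln(upper) - ln(lower)) / (2 * 1.96).
    """
    num, den = 0.0, 0.0
    for hr, lo, hi in hrs_with_cis:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2                 # inverse-variance weight
        num += w * math.log(hr)
        den += w
    log_pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    ci = (math.exp(log_pooled - 1.96 * se_pooled),
          math.exp(log_pooled + 1.96 * se_pooled))
    return math.exp(log_pooled), ci

# Hypothetical cohort-specific estimates (HR, 95% CI bounds):
hr, (lo, hi) = pooled_hr([(0.8, 0.6, 1.07), (0.7, 0.5, 0.98)])
```

The pooled estimate lands between the two cohort HRs, weighted toward the more precise one; a random-effects model would widen the CI if the cohorts disagreed substantially.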
Ms. Guo noted that a Swedish national survey among 4-year-olds found a mean SSB consumption of 187 g/d with a mean frequency of once daily. The most desired changes in food habits are a lower intake of soft drinks, sweets, crisps, cakes, and biscuits and an increase in the intake of fruits and vegetables. A similar Norwegian survey among 2-year-olds showed that SSBs were consumed by 36% of all children with a mean intake of 40 g/d.
The exact mechanism by which sugar affects the intestinal microbiota is not established. “However, what we do know is that an excessive intake of sugar can disrupt the balance of the gut microbiome,” Ms. Guo said. “And if the child has a high intake of foods high in sugar, that also increases the chances that the child’s overall diet has a lower intake of other foods that contribute to a diverse microbiome, such as fruits and vegetables.”
An ‘Elegant’ Study
In an accompanying editorial, gastroenterologist Ashwin N. Ananthakrishnan, MBBS, MPH, AGAF, of Mass General Brigham and the Mass General Research Institute, Boston, cautioned that accurately measuring food intake in very young children is difficult, and dietary questionnaires in this study did not address food additives and emulsifiers common in commercial baby food, which may play a role in the pathogenesis of IBD.
Another study limitation is that the dietary questionnaire used has not been qualitatively or quantitatively validated against other more conventional methods, said Dr. Ananthakrishnan, who was not involved in the research.
Nevertheless, he called the study “elegant” and said it expands the data on the importance of this period in IBD development. “Although in the present study there was no association between diet at 3 years and development of IBD (in contrast to the association observed for dietary intake at 1 year), other prospective cohorts of adult-onset IBD have demonstrated an inverse association between vegetable or fish intake and reduced risk for CD while sugar-sweetened beverages have been linked to a higher risk for IBD.”
As to the question of recommending early preventive diet for IBD, “thus far, data on the impact of diet very early in childhood, outside of breastfeeding, on the risk for IBD has been lacking,” Dr. Ananthakrishnan said in an interview. “This important study highlights that diet as early as 1 year can modify subsequent risk for IBD. This raises the intriguing possibility of whether early changes in diet could be used, particularly in those at higher risk, to reduce or even prevent future development of IBD. Of course, more work needs to be done to define modifiability of diet as a risk factor, but these are important supportive data.”
In his editorial, Dr. Ananthakrishnan stated that despite the absence of gold-standard interventional data demonstrating a benefit of dietary interventions, “in my opinion, it may still be reasonable to suggest that motivated individuals incorporate several of the dietary patterns associated with lower risk for IBD from this and other studies. This includes ensuring adequate dietary fiber, particularly from fruits and vegetables, intake of fish, minimizing sugar-sweetened beverages and preferring fresh over processed and ultra-processed foods and snacks.” According to the study authors, their novel findings support further research on the role of childhood diet in the prevention of IBD.
The All Babies in Southeast Sweden Study is supported by Barndiabetesfonden (Swedish Child Diabetes Foundation), the Swedish Council for Working Life and Social Research, the Swedish Research Council, the Medical Research Council of Southeast Sweden, the JDRF Wallenberg Foundation, ALF and LFoU grants from Region Östergötland and Linköping University, and the Joanna Cocozza Foundation.
The Norwegian Mother, Father and Child Cohort Study is supported by the Norwegian Ministry of Health and Care Services and the Ministry of Education and Research.
Ms. Guo received grants from the Swedish Society for Medical Research and the Henning and Johan Throne-Holst Foundation to conduct this study. Co-author Karl Mårild has received funding from the Swedish Society for Medical Research, the Swedish Research Council, and ALF, Sweden’s medical research and education co-ordinating body. The authors declared no competing interests. Dr. Ananthakrishnan is supported by the National Institutes of Health, the Leona M. and Harry B. Helmsley Charitable Trust, and the Chleck Family Foundation. He has served on the scientific advisory board for Geneoscopy.
FROM GUT
Late-Stage Incidence Rates Support CRC Screening From Age 45
Rising rates of late-stage disease in adults aged 46-49 years support initiating colorectal cancer (CRC) screening at age 45, a cross-sectional study of stage-stratified CRC found.
It is well known that CRC is becoming more prevalent in people younger than 50 years, but stage-stratified analyses have been lacking.
Staging analysis in this age group is important, however, because an increasing burden of advanced-stage disease would provide further evidence for earlier screening initiation, wrote Eric M. Montminy, MD, a gastroenterologist at John H. Stroger Jr. Hospital of Cook County, Chicago, Illinois, and colleagues in JAMA Network Open.
The United States Preventive Services Task Force (USPSTF) has recommended that average-risk screening begin at 45 years of age, as do the American Gastroenterological Association and other GI societies, although the American College of Physicians last year published clinical guidance recommending 50 years as the age to start screening for CRC for patients with average risk.
“Patients aged 46-49 may become confused on which guideline to follow, similar to confusion occurring with prior breast cancer screening changes,” Dr. Montminy said in an interview. “We wanted to demonstrate incidence rates with stage stratification to help clarify the incidence trends in this age group. Stage stratification is key because it provides insight into the relationship between time and cancer incidence, ie, is screening finding early cancer or not?”
A 2020 study in JAMA Network Open demonstrated a 46.1% increase in CRC incidence rates (IRs) in persons aged 49-50 years. This steep increase is consistent with the presence of a large preexisting and undetected case burden.
“Our results demonstrate that adults aged 46-49 years, who are between now-conflicting guidelines on whether to start screening at age 45 or 50 years, have an increasing burden of more advanced-stage CRC and thus may be at an increased risk if screening is not initiated at age 45 years,” Dr. Montminy’s group wrote.
Using incidence data per 100,000 population from the National Cancer Institute’s Surveillance, Epidemiology, and End Results registry, the investigators observed the following IRs for early-onset CRC in the age group of 46-49 years:
- Distant adenocarcinoma IRs increased faster than other stages: annual percentage change (APC), 2.2 (95% CI, 1.8-2.6).
- Regional IRs also significantly increased: APC, 1.3 (95% CI, 0.8-1.7).
- Absolute regional IRs of CRC in the age bracket of 46-49 years are similar to total pancreatic cancer IRs in all ages and all stages combined (13.2 of 100,000) over similar years. When distant IRs for CRC are added to regional IRs, the combined CRC IRs are double those for pancreatic cancer of all stages combined.
- The only decrease was seen in localized IRs: APC, -0.6 (95% CI, -1 to -0.2).
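The APC figures in the bullets above summarize how fast each stage-specific incidence rate is trending. A common way to compute an APC (the NCI's Joinpoint software fits segmented versions of the same model) is a log-linear regression of rate on calendar year; the minimal ordinary-least-squares sketch below illustrates the arithmetic and is not the study's exact software pipeline.

```python
import math

def annual_percent_change(years, rates):
    """Annual percentage change (APC): fit log(rate) = a + b*year by
    ordinary least squares, then APC = (exp(b) - 1) * 100."""
    n = len(years)
    logs = [math.log(r) for r in rates]
    ybar = sum(years) / n
    lbar = sum(logs) / n
    b = (sum((y - ybar) * (l - lbar) for y, l in zip(years, logs))
         / sum((y - ybar) ** 2 for y in years))
    return (math.exp(b) - 1) * 100

# A series growing 2.2% per year (like the distant-stage trend) recovers APC = 2.2:
years = list(range(2000, 2010))
rates = [10 * 1.022 ** (y - 2000) for y in years]
apc = annual_percent_change(years, rates)
```

An APC of 2.2 therefore means the rate multiplies by about 1.022 each year; a negative APC, as for localized disease, means a steady proportional decline.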
“My best advice for clinicians is to provide the facts from the data to patients so they can make an informed health decision,” Dr. Montminy said. “This includes taking an appropriate personal and family history and having the patient factor this aspect into their decision on when and how they want to perform colon cancer screening.”
His institution adheres to the USPSTF recommendation of initiation of CRC screening at age 45 years.
Findings From 2000 to 2020
During the 2000-2020 period, 26,887 CRCs were diagnosed in adults aged 46-49 years (54.5% in men).
As of 2020, the localized adenocarcinoma IR decreased to 7.7 of 100,000, but regional adenocarcinoma IR increased to 13.4 of 100,000 and distant adenocarcinoma IR increased to 9.0 of 100,000.
Regional adenocarcinoma IR remained the highest of all stages in 2000-2020. From 2014 to 2020, distant IRs became similar to localized IRs, except in 2017 when distant IRs were significantly higher than localized.
Why the CRC Uptick?
“It remains an enigma at this time as to why we’re seeing this shift,” Dr. Montminy said, noting that etiologies from the colonic microbiome to cellphones have been postulated. “To date, no theory has substantially provided causality. But whatever the source is, it is affecting Western countries in unison with data demonstrating a birth cohort effect as well,” he added. “We additionally know, based on the current epidemiologic data, that current screening practices are failing, and a unified discussion must occur in order to prevent young patients from developing advanced colon cancer.”
Offering his perspective on the findings, Joshua Meyer, MD, vice chair of translational research in the Department of Radiation Oncology at Fox Chase Cancer Center in Philadelphia, said the findings reinforce the practice of offering screening to average-risk individuals starting at age 45 years, the threshold at his institution. “There are previously published data demonstrating an increase in advanced stage at the time of screening initiation, and these data support that,” said Dr. Meyer, who was not involved in the present analysis.
More research needs to be done, he continued, not just on optimal age but also on the effect of multiple other factors impacting risk. “These may include family history and genetic risk as well as the role of blood- and stool-based screening assays in an integrated strategy to screen for colorectal cancer.”
There are multiple screening tests, and while colonoscopy, the gold standard, is very safe, it is not completely without risks, Dr. Meyer added. “And the question of the appropriate allocation of limited societal resources continues to be discussed on a broader level and largely explains the difference between the two guidelines.”
This study received no specific funding. Co-author Jordan J. Karlitz, MD, reported personal fees from GRAIL (senior medical director) and an equity position from Gastro Girl/GI On Demand outside of the submitted work. Dr. Meyer disclosed no conflicts of interest relevant to his comments.
FROM JAMA NETWORK OPEN
Heart Failure the Most Common Complication of Atrial Fibrillation, Not Stroke
FROM BMJ
The lifetime risk of atrial fibrillation (AF) increased between 2000 and 2022 from one in four to one in three, a Danish population-based study of temporal trends found.
Heart failure was the most frequent complication linked to this arrhythmia, with a lifetime risk of two in five, twice that of stroke, according to investigators led by Nicklas Vinter, MD, PhD, a postdoctoral researcher at the Danish Center for Health Service Research in the Department of Clinical Medicine at Aalborg University, Denmark.
Published in BMJ, the study found the lifetime risks of post-AF stroke, ischemic stroke, and myocardial infarction improved only modestly over time and remained high, with virtually no improvement in the lifetime risk of heart failure.
“Our work provides novel lifetime risk estimates that are instrumental in facilitating effective risk communication between patients and their physicians,” Dr. Vinter said in an interview. “The knowledge of risks from a lifelong perspective may serve as a motivator for patients to commence or intensify preventive efforts.” AF patients could, for example, adopt healthier lifestyles or adhere to prescribed medications, Dr. Vinter explained.
“The substantial lifetime risk of heart failure following atrial fibrillation necessitates heightened attention to its prevention and early detection,” Dr. Vinter said. “Furthermore, the high lifetime risk of stroke remains a critical complication, which highlights the importance of continuous attention to the initiation and maintenance of oral anticoagulation therapy.”
The Study
The cohort consisted of 3.5 million individuals (51.7% women) who were free of AF at age 45 years or older. These individuals were followed until incident AF, migration, death, or end of follow-up, whichever came first.
All 362,721 individuals with incident AF (53.6% men) but no prevalent complication were further followed over two time periods (2000-2010 and 2011-2022) until incident heart failure, stroke, or myocardial infarction.
Among the findings:
- Lifetime AF risk increased from 24.2% in 2000-2010 to 30.9% in 2011-2022, for a difference of 6.7% (95% confidence interval [CI], 6.5%-6.8%).
- Lifetime AF risk rose across all subgroups over time, with a larger increase in men and individuals with heart failure, myocardial infarction, stroke, diabetes, and chronic kidney disease.
- Lifetime risk of heart failure was 42.9% in 2000-2010 and 42.1% in 2011-2022, for a difference of −0.8% (95% CI, −3.8% to 2.2%).
- The lifetime risks of post-AF stroke and of myocardial infarction decreased slightly between the two periods, from 22.4% to 19.9% for stroke (difference −2.5%, 95% CI, −4.2% to −0.7%) and from 13.7% to 9.8% for myocardial infarction (−3.9%, 95% CI, −5.3% to −2.4%). No differential decrease between men and women emerged.
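The "one in four to one in three" framing in the lead is just the rounded reciprocal of the lifetime risk percentages in the first bullet; a trivial sketch of that conversion:

```python
def one_in_n(risk_pct):
    """Express a lifetime risk percentage as a rounded '1 in N' figure."""
    return round(100 / risk_pct)

# 24.2% lifetime AF risk in 2000-2010 -> about 1 in 4
# 30.9% lifetime AF risk in 2011-2022 -> about 1 in 3
# 42.9% lifetime heart failure risk   -> about 1 in 2, i.e. "two in five" before rounding
```

The heart-failure figure (42.9%) is closer to two in five than one in two, which is the phrasing the investigators chose; rounding the reciprocal loses that nuance.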
“Our novel quantification of the long-term downstream consequences of atrial fibrillation highlights the critical need for treatments to further decrease stroke risk as well as for heart failure prevention strategies among patients with atrial fibrillation,” the Danish researchers wrote.
Offering an outsider’s perspective, John P. Higgins, MD, MBA, MPhil, a sports cardiologist at McGovern Medical School at The University of Texas Health Science Center at Houston, said, “Think of atrial fibrillation as a barometer of underlying stress on the heart. When blood pressure is high, or a patient has underlying asymptomatic coronary artery disease or heart failure, they are more likely to have episodes of atrial fibrillation.”
According to Dr. Higgins, risk factors for AF are underappreciated in the United States and elsewhere, and primary care doctors need to be aware of them. “We should try to identify these risk factors and do primary prevention to improve risk factors to reduce the progression to heart failure and myocardial infarction and stroke. But lifelong prevention is even better,” he added. “Doing things to prevent actually getting risk factors in the first place. So a healthy lifestyle including exercise, diet, hydration, sleep, relaxation, social contact, and a little sunlight might be the long-term keys and starting them at a young age, too.”
In an accompanying editorial, Jianhua Wu, PhD, a professor of biostatistics and health data science with the Wolfson Institute of Population Health at Queen Mary University of London, and a colleague, cited the study’s robust observational research and called the analysis noteworthy for its quantification of the long-term risks of post-AF sequelae. They cautioned, however, that its grouping into two periods (2000-2010 and 2011-2022) came at the cost of losing temporal resolution. They also called out the lack of reporting on the ethnic composition of the study population, a factor that influences lifetime AF risk, and the absence of subgroup analysis by socioeconomic status, which affects incidence and outcomes.
The editorialists noted that while interventions to prevent stroke dominated AF research and guidelines during the study time period, no evidence suggests these interventions can prevent incident heart failure. “Alignment of both randomised clinical trials and guidelines to better reflect the needs of the real-world population with atrial fibrillation is necessary because further improvements to patient prognosis are likely to require a broader perspective on atrial fibrillation management beyond prevention of stroke,” they wrote.
In the meantime, this study “challenges research priorities and guideline design, and raises critical questions for the research and clinical communities about how the growing burden of atrial fibrillation can be stopped,” they wrote.
This work was supported by the Danish Cardiovascular Academy, which is funded by the Novo Nordisk Foundation, and The Danish Heart Foundation. Dr. Vinter has been an advisory board member and consultant for AstraZeneca and has an institutional research grant from BMS/Pfizer unrelated to the current study. He reported personal consulting fees from BMS and Pfizer. Other coauthors disclosed research support from and/or consulting work for private industry, as well as grants from not-for-profit research-funding organizations. Dr. Higgins had no competing interest to declare. The editorial writers had no relevant financial interests to declare. Dr. Wu is supported by Barts Charity.
In the meantime this study “challenges research priorities and guideline design, and raises critical questions for the research and clinical communities about how the growing burden of atrial fibrillation can be stopped,” they wrote.
This work was supported by the Danish Cardiovascular Academy, which is funded by the Novo Nordisk Foundation, and The Danish Heart Foundation. Dr. Vinter has been an advisory board member and consultant for AstraZeneca and has an institutional research grant from BMS/Pfizer unrelated to the current study. He reported personal consulting fees from BMS and Pfizer. Other coauthors disclosed research support from and/or consulting work for private industry, as well as grants from not-for-profit research-funding organizations. Dr. Higgins had no competing interest to declare. The editorial writers had no relevant financial interests to declare. Dr. Wu is supported by Barts Charity.
FROM BMJ
The lifetime risk of atrial fibrillation (AF) rose from one in four to one in three between 2000 and 2022, a Danish population-based study of temporal trends found.
Heart failure was the most frequent complication linked to this arrhythmia, with a lifetime risk of two in five, twice that of stroke, according to investigators led by Nicklas Vinter, MD, PhD, a postdoctoral researcher at the Danish Center for Health Service Research in the Department of Clinical Medicine at Aalborg University, Denmark.
Published in BMJ, the study found the lifetime risks of post-AF ischemic stroke and myocardial infarction improved only modestly over time and remained high, with virtually no improvement in the lifetime risk of heart failure.
“Our work provides novel lifetime risk estimates that are instrumental in facilitating effective risk communication between patients and their physicians,” Dr. Vinter said in an interview. “The knowledge of risks from a lifelong perspective may serve as a motivator for patients to commence or intensify preventive efforts.” AF patients could, for example, adopt healthier lifestyles or adhere to prescribed medications, Dr. Vinter explained.
“The substantial lifetime risk of heart failure following atrial fibrillation necessitates heightened attention to its prevention and early detection,” Dr. Vinter said. “Furthermore, the high lifetime risk of stroke remains a critical complication, which highlights the importance of continuous attention to the initiation and maintenance of oral anticoagulation therapy.”
The Study
The cohort consisted of 3.5 million individuals (51.7% women) who did not have AF as of age 45 or older. These individuals were followed until incident AF, migration, death, or end of follow-up, whichever came first.
All 362,721 individuals with incident AF (53.6% men) but no prevalent complication were further followed over two time periods (2000-2010 and 2011-2020) until incident heart failure, stroke, or myocardial infarction.
Among the findings:
- Lifetime AF risk increased from 24.2% in 2000-2010 to 30.9% in 2011-2022, for a difference of 6.7% (95% confidence interval [CI], 6.5%-6.8%).
- Lifetime AF risk rose across all subgroups over time, with a larger increase in men and individuals with heart failure, myocardial infarction, stroke, diabetes, and chronic kidney disease.
- Lifetime risk of heart failure was 42.9% in 2000-2010 and 42.1% in 2011-2022, for a difference of −0.8% (95% CI, −3.8% to 2.2%).
- The lifetime risks of post-AF stroke and of myocardial infarction decreased slightly between the two periods, from 22.4% to 19.9% for stroke (difference −2.5%, 95% CI, −4.2% to −0.7%) and from 13.7% to 9.8% for myocardial infarction (−3.9%, 95% CI, −5.3% to −2.4%). No differential decrease between men and women emerged.
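The risk differences and confidence intervals above follow the standard arithmetic for comparing two proportions. A minimal sketch of that calculation, using the reported lifetime AF risks as plain proportions (an assumption: the study estimated lifetime risk with survival methods accounting for censoring and competing risks, and the cohort split per period below is hypothetical):

```python
from math import sqrt

def risk_difference_ci(p1, n1, p2, n2, z=1.96):
    """Difference between two proportions with a Wald 95% CI.

    Illustrative only: treats the reported lifetime risks as simple
    proportions; the per-period sample sizes are assumed, not taken
    from the paper.
    """
    diff = p2 - p1
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Reported lifetime AF risk: 24.2% (2000-2010) vs 30.9% (2011-2022).
diff, lo, hi = risk_difference_ci(0.242, 1_750_000, 0.309, 1_750_000)
print(f"difference = {diff:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

With cohorts in the millions the standard error is tiny, which is why the paper can report a 6.7% difference with a CI only a few tenths of a point wide.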
“Our novel quantification of the long-term downstream consequences of atrial fibrillation highlights the critical need for treatments to further decrease stroke risk as well as for heart failure prevention strategies among patients with atrial fibrillation,” the Danish researchers wrote.
Offering an outsider’s perspective, John P. Higgins, MD, MBA, MPhil, a sports cardiologist at McGovern Medical School at The University of Texas Health Science Center at Houston, said, “Think of atrial fibrillation as a barometer of underlying stress on the heart. When blood pressure is high, or a patient has underlying asymptomatic coronary artery disease or heart failure, they are more likely to have episodes of atrial fibrillation.”
According to Dr. Higgins, risk factors for AF are underappreciated in the United States and elsewhere, and primary care doctors need to be aware of them. “We should try to identify these risk factors and do primary prevention to improve risk factors to reduce the progression to heart failure and myocardial infarction and stroke. But lifelong prevention is even better,” he added. “Doing things to prevent actually getting risk factors in the first place. So a healthy lifestyle including exercise, diet, hydration, sleep, relaxation, social contact, and a little sunlight might be the long-term keys and starting them at a young age, too.”
In an accompanying editorial, Jianhua Wu, PhD, a professor of biostatistics and health data science with the Wolfson Institute of Population Health at Queen Mary University of London, and a colleague, cited the study’s robust observational research and called the analysis noteworthy for its quantification of the long-term risks of post-AF sequelae. They cautioned, however, that its grouping into two 10-year periods (2000-2010 and 2011-2020) came at the cost of losing temporal resolution. They also called out the lack of reporting on the ethnic composition of the study population, a factor that influences lifetime AF risk, and the absence of subgroup analysis by socioeconomic status, which affects incidence and outcomes.
The editorialists noted that while interventions to prevent stroke dominated AF research and guidelines during the study time period, no evidence suggests these interventions can prevent incident heart failure. “Alignment of both randomised clinical trials and guidelines to better reflect the needs of the real-world population with atrial fibrillation is necessary because further improvements to patient prognosis are likely to require a broader perspective on atrial fibrillation management beyond prevention of stroke,” they wrote.
In the meantime, this study “challenges research priorities and guideline design, and raises critical questions for the research and clinical communities about how the growing burden of atrial fibrillation can be stopped,” they wrote.
This work was supported by the Danish Cardiovascular Academy, which is funded by the Novo Nordisk Foundation, and The Danish Heart Foundation. Dr. Vinter has been an advisory board member and consultant for AstraZeneca and has an institutional research grant from BMS/Pfizer unrelated to the current study. He reported personal consulting fees from BMS and Pfizer. Other coauthors disclosed research support from and/or consulting work for private industry, as well as grants from not-for-profit research-funding organizations. Dr. Higgins had no competing interest to declare. The editorial writers had no relevant financial interests to declare. Dr. Wu is supported by Barts Charity.
Antibiotics of Little Benefit in Lower Respiratory Tract Infection
Antibiotics had no measurable effect on the severity or duration of coughs due to acute lower respiratory tract infection (LRTI, or acute bronchitis), a large prospective study found.
In fact, those receiving an antibiotic in the primary- and urgent-care setting had a small but significant increase in overall length of illness (17.5 vs 15.9 days; P = .05) — largely because patients with longer illness before the index visit were more likely to receive these drugs. The study adds further support for reducing the prescription of antibiotics for LRTIs.
“Importantly, the pathogen data demonstrated that the length of time until illness resolution for those with bacterial infection was the same as for those not receiving an antibiotic versus those receiving one (17.3 vs 17.4 days),” researchers led by Daniel J. Merenstein, MD, a professor and director of research programs, family medicine, at Georgetown University Medical Center in Washington, wrote in the Journal of General Internal Medicine (doi: 10.1007/s11606-024-08758-y).
Patients believed an antibiotic would shorten their illness by an average of about 4 days, from 13.4 days to 9.7 days, whereas the average duration of all coughs was more than 2 weeks regardless of pathogen type or receipt of an antibiotic.
“Patients had unrealistic expectations regarding the duration of LRTI and the effect of antibiotics, which should be the target of antibiotic stewardship efforts,” the group wrote.
LRTIs can, however, be dangerous, with 3%-5% progressing to pneumonia, “but not everyone has easy access at an initial visit to an x-ray, which may be the reason clinicians still give antibiotics without any other evidence of a bacterial infection,” Dr. Merenstein said in a news release. “Patients have come to expect antibiotics for a cough, even if it doesn’t help. Basic symptom-relieving medications plus time bring a resolution to most people’s infections.”
The authors noted that cough is the most common reason for an ambulatory care visit, accounting for 2.7 million outpatient visits and more than 4 million emergency department visits annually.
Risks
Overuse of antibiotics can result in dizziness, nausea, diarrhea, and rash, along with a roughly 4% chance of serious adverse effects including anaphylaxis; Stevens-Johnson syndrome, a serious skin and mucous membrane disorder; and Clostridioides difficile-associated diarrhea.
An estimated half of all antibiotic prescriptions for acute respiratory conditions are unnecessary. Before the COVID-19 pandemic, antibiotics were prescribed about 70% of the time for a diagnosis of uncomplicated cough and LRTI. The viral pandemic did not change this practice, according to a meta-analysis of 130 studies showing that 78% of COVID-19 patients were prescribed an antibiotic.
The Study
The study looked at a cohort of 718 patients, with a mean age of 38.9 years, 65.3% female, of whom 207 received an antibiotic and 511 did not. Of those with baseline data, 29% had an antibiotic prescribed at baseline, the most common (in 85%) being amoxicillin-clavulanate, azithromycin, doxycycline, and amoxicillin. Antibiotics had no effect on the duration or overall severity of cough in viral, bacterial, or mixed infections. Receipt of an antibiotic did, however, reduce the likelihood of a follow-up visit: 14.1% vs 8.2% (adjusted odds ratio, 0.47; 95% confidence interval, 0.26-0.84) — perhaps because it removed the motivation for seeking another consultation. Antibiotic recipients were more likely to receive a systemic corticosteroid (31.9% vs 4.5%, P <.001) and were also more likely to receive an albuterol inhaler (22.7% vs 7.6%, P <.001).
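The adjusted odds ratio above comes from a regression model, but the crude version can be reproduced from the group sizes and percentages in the paragraph. A hedged sketch (the event counts below are back-calculated from the reported percentages, so they are approximate, and the crude estimate differs from the paper's covariate-adjusted 0.47):

```python
from math import sqrt, exp, log

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Woolf (log-scale) 95% CI.

    a/b: events/non-events in the exposed group;
    c/d: events/non-events in the unexposed group.
    Illustrative only; does not adjust for covariates.
    """
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Follow-up visits: ~8.2% of 207 antibiotic recipients (about 17 events)
# vs ~14.1% of 511 non-recipients (about 72 events).
or_, lo, hi = odds_ratio_ci(17, 190, 72, 439)
print(f"crude OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

The crude odds ratio lands near 0.55; adjustment for baseline differences (such as longer illness before the index visit) plausibly moves it toward the reported 0.47.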
Jeffrey A. Linder, MD, MPH, a primary care physician and chief of internal medicine and geriatrics at Northwestern University Feinberg School of Medicine in Chicago, agrees that in the vast majority of LRTIs — usually acute bronchitis — antibiotics do not speed the healing process. “Forty years of research show that antibiotics do not make acute bronchitis go away any faster,” Dr. Linder, who was not involved in the current study, said in an interview. “There’s even growing evidence that a lot of pneumonia is viral as well, and 10 or 20 years from now we may often not be giving antibiotics for pneumonia because we’ll be able to see better if it’s caused by a virus.”
A large 2018 review by Dr. Linder and associates reported that 46% of antibiotics were prescribed without any infection-related diagnosis code and 20% without an office visit.
Dr. Linder routinely informs patients requesting an antibiotic about the risks of putting an ineffective chemical into their body. “I stress that it can cause rash and other allergic reactions, and even promote C diff infection,” he said. “And I also say it messes with the good bacteria in the microbiome, and they usually come around.”
Patients need to know, Dr. Linder added, that the normal course of healing the respiratory tract after acute bronchitis takes weeks. While a wet cough with sputum or phlegm will last a few days, it’s replaced with a dry annoying cough that persists for up to 3 weeks. “As long as they’re feeling generally better, that cough is normal,” he said. “A virus has run roughshod over their airways and they need a long time to heal and the cough is part of the healing process. Think how long it takes to heal a cut on a finger.”
In an era of escalating antimicrobial resistance fueled by antibiotic overuse, it’s become increasingly important to reserve antibiotics for necessary cases. According to a recent World Health Organization call to action, “Uncontrolled antimicrobial resistance is expected to lower life expectancy and lead to unprecedented health expenditure and economic losses.”
That said, there is important clinical work to be done to determine if there is a limited role for antibiotics in patients with cough, perhaps based on age and baseline severity. “Serious cough symptoms and how to treat them properly needs to be studied more, perhaps in a randomized clinical trial as this study was observational and there haven’t been any randomized trials looking at this issue since about 2012,” Dr. Merenstein said.
This research was funded by the Agency for Healthcare Research and Quality. The authors have no conflicts of interest to declare. Dr. Linder reported stock ownership in pharmaceutical companies but none that make antibiotics or other infectious disease drugs.
FROM JOURNAL OF GENERAL INTERNAL MEDICINE
Salt Substitutes May Cut All-Cause and Cardiovascular Mortality
Large-scale salt substitution holds promise for reducing mortality with no elevated risk of serious harms, especially for older people at increased cardiovascular disease (CVD) risk, a systematic review and meta-analysis by Australian researchers suggested.
The study, published in Annals of Internal Medicine, adds more evidence that broad adoption of potassium-rich salt substitutes for food preparation could have a significant effect on population health.
Although the supporting evidence was of low certainty, the analysis of 16 international randomized controlled trials of various interventions with 35,321 participants found salt substitution to be associated with an absolute reduction of 5 in 1000 in all-cause mortality (confidence interval, –3 to –7) and 3 in 1000 in CVD mortality (CI, –1 to –5).
Large-scale salt substitution holds promise for reducing mortality with no elevated risk of serious harms, especially for older people at increased cardiovascular disease (CVD) risk, a systematic review and meta-analysis by Australian researchers suggested.
The study, published in Annals of Internal Medicine, adds more evidence that broad adoption of potassium-rich salt substitutes for food preparation could have a significant effect on population health.
Although the supporting evidence was of low certainty, the analysis of 16 international randomized controlled trials of various interventions with 35,321 participants found salt substitution to be associated with an absolute reduction of 5 in 1000 in all-cause mortality (confidence interval [CI], –3 to –7) and of 3 in 1000 in CVD mortality (CI, –1 to –5).
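For readers who want to check the arithmetic, an absolute risk reduction of "5 in 1000" is simply the difference in event rates between arms, scaled to 1000 participants. The event counts below are hypothetical, chosen only to illustrate the calculation; they are not the meta-analysis data.

```python
# Sketch: absolute risk reduction (ARR) per 1000 participants.
# The arm-level event counts below are hypothetical illustrations,
# not figures from the meta-analysis.

def arr_per_1000(events_control, n_control, events_treated, n_treated):
    """ARR per 1000 participants (positive = fewer events with treatment)."""
    risk_control = events_control / n_control
    risk_treated = events_treated / n_treated
    return (risk_control - risk_treated) * 1000

# Hypothetical: 30/1000 deaths with regular salt vs 25/1000 with substitute
print(round(arr_per_1000(30, 1000, 25, 1000), 1))  # 5.0 per 1000
```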
Led by Hannah Greenwood, BPsychSc, a cardiovascular researcher at the Institute for Evidence-Based Healthcare at Bond University in Gold Coast, Queensland, the investigators also found very low certainty evidence of an absolute reduction of 8 in 1000 in major adverse cardiovascular events (CI, 0 to –15), with a 1 in 1000 decrease in more serious adverse events (CI, 4 to –2) in the same population.
Seven of the 16 studies were conducted in China or Taiwan, and seven were conducted in populations of older age (mean age, 62 years) and/or at higher cardiovascular risk.
With most of the data deriving from populations of older age at higher-than-average CV risk and/or eating an Asian diet, the findings’ generalizability to populations following a Western diet and/or at average CVD risk is limited, the researchers acknowledged.
“We are less certain about the effects in Western, younger, and healthy population groups,” corresponding author Loai Albarqouni, MD, MSc, PhD, assistant professor at the Institute for Evidence-Based Healthcare, said in an interview. “While we saw small, clinically meaningful reductions in cardiovascular deaths and events, effectiveness should be better established before salt substitutes are recommended more broadly, though they are promising.”
In addition, he said, since the longest follow-up of substitute use was 10 years, “we can’t speak to benefits or harms beyond this time frame.”
Still, recommending salt substitutes may be an effective way for physicians to help patients reduce CVD risk, especially those hesitant to start medication, he said. “But physicians should take into account individual circumstances and other factors like kidney disease before recommending salt substitutes. Other non-drug methods of reducing cardiovascular risk, such as diet or exercise, may also be considered.”
Dr. Albarqouni stressed that sodium intake is not the only driver of CVD and reducing intake is just one piece of the puzzle. He cautioned that substitutes themselves can contain high levels of sodium, “so if people are using them in large volumes, they may still present similar risks to the sodium in regular salt.”
While the substitutes appear safe, as evidenced by the low incidence of hyperkalemia and renal dysfunction, the supporting evidence is scarce, heterogeneous, and weak, the authors stressed.
“They can pose a health risk among people who have kidney disease, diabetes, and heart failure or who take certain medications, including ACE inhibitors and potassium-sparing diuretics,” said Emma Laing, PhD, RDN, director of dietetics at the University of Georgia in Athens. And while their salty flavor makes these a reasonable alternative to sodium chloride, “the downsides include a higher cost and bitter or metallic taste in high amounts. These salt substitutes tend to be better accepted by patients if they contain less than 30% potassium chloride.”
She noted that flavorful salt-free spices, herbs, lemon and lime juices, and vinegars can be effective in lowering dietary sodium when used in lieu of cooking salt.
In similar findings, a recent Chinese study of elderly normotensive people in residential care facilities observed a decrease in the incidence of hypertension with salt substitution.
Approximately one-third of otherwise healthy individuals are salt-sensitive, rising to more than 50% of those with hypertension, and excessive salt intake is estimated to be responsible for nearly 5 million deaths per year globally.
How much impact could household food preparation with salt substitutes really have in North America where sodium consumption is largely driven by processed and takeout food? “While someone may make the switch to a salt substitute for home cooking, their sodium intake might still be very high if a lot of processed or takeaway foods are eaten,” Dr. Albarqouni said. “To see large population impacts, we will likely need policy and institutional-level change as to how sodium is used in food processing, alongside individuals’ switching from regular salt to salt substitutes.”
In agreement, an accompanying editorial by researchers from the universities of Sydney, New South Wales, and California, San Diego, noted the failure of governments and industry to address the World Health Organization’s call for a 30% reduction in global sodium consumption by 2025. With hypertension a major global health burden, the editorialists, led by J. Jaime Miranda, MD, MSc, PhD, of the Sydney School of Public Health at the University of Sydney, believe salt substitutes could be an accessible path toward that goal for food production companies.
“Although the benefits of reducing salt intake have been known for decades, little progress has been made in the quest to lower salt intake on the industry and commercial fronts with existing regulatory tools,” they wrote. “Consequently, we must turn our attention to effective evidence-based alternatives, such as the use of potassium-enriched salts.”
Given the high rates of nonadherence to antihypertensive medication, nonpharmacologic measures to improve blood pressure control are required, they added. “Expanding the routine use of potassium-enriched salts across households and the food industry would benefit not only persons with existing hypertension but all members of the household and communities. An entire shift of the population’s blood pressure curve is possible.”
The study authors called for research to determine the cost-effectiveness of salt substitution in older Asian populations and its efficacy in groups at average cardiovascular risk or following a Western diet.
This research was supported by the National Health and Medical Research Council of Australia and an Australian Government Research Training Program Scholarship. Coauthor Dr. Lauren Ball disclosed support from the National Health and Medical Research Council of Australia. Ms. Hannah Greenwood received support from the Australian government and Bond University. Dr. Miranda disclosed numerous consulting, advisory, and research-funding relationships with government, academic, philanthropic, and nonprofit organizations. Editorial commentator Dr. Kathy Trieu reported research support from multiple government and non-profit research-funding organizations. Dr. Cheryl Anderson disclosed ties to Weight Watchers and the McCormick Science Institute, as well as support from numerous government, academic, and nonprofit research-funding agencies.
FROM ANNALS OF INTERNAL MEDICINE
Premenstrual Disorders and Perinatal Depression: A Two-Way Street
Premenstrual disorders (PMDs) and perinatal depression (PND) appear to have a bidirectional association, a Swedish national registry-based analysis found.
In women with PND, 2.9% had PMDs before pregnancy vs 0.6% in a matched cohort of unaffected women, according to an international team led by Qian Yang, MD, PhD, of the Institute of Environmental Medicine at the Karolinska Institutet in Stockholm, Sweden. Their study appears in PLoS Medicine.
“Preconception and maternity care providers should be aware of the risk of developing perinatal depression among women with a history of PMDs,” Dr. Yang said in an interview. “Healthcare providers may inform women with perinatal depression about the potential risk of PMDs when menstruation returns after childbirth.” She recommended screening as part of routine perinatal care to identify and treat the condition at an early stage. Counseling and medication may help prevent adverse consequences.
In other findings, the association with PMDs held for both prenatal and postnatal depression, regardless of any history of psychiatric disorders, and also in full-sister comparisons, the authors noted, with a stronger association in women without a history of psychiatric disorders (P for interaction <.001).
“Interestingly, we noted a stronger association between PMDs and subsequent PND than the association in the other direction,” Dr. Yang said. And although many experience PMD symptom onset in adolescence, symptom worsening has been reported with increasing age and parity. “It is possible that women with milder premenstrual symptoms experienced worse symptoms after pregnancy and are therefore first diagnosed with PMD after pregnancy,” the authors hypothesized.
Both PMDs and PND share depressive symptomatology and onset coinciding with hormonal fluctuations, particularly estrogen and progesterone, suggesting a shared etiology, Dr. Yang explained. “It’s plausible that an abnormal response to natural hormone fluctuations predisposes women to both PMDs and PND. However, the underlying mechanism is complex, and future research is needed to reveal the underlying etiology.”
Affecting a majority of women of reproductive age to some degree, PMDs in certain women can cause significant functional impairment and, when severe, have been linked to increased risks of accidents and suicidal behavior. The psychological symptoms of the more serious form, premenstrual dysphoric disorder, for example, are associated with a 50%-78% lifetime risk for psychiatric disorders, including major depressive, dysthymic, seasonal affective, and generalized anxiety disorders, as well as suicidality.
Mood disorders are common in pregnancy and the postpartum period.
The Swedish Study
In 1.8 million singleton pregnancies in Sweden during 2001-2018, the investigators identified 84,949 women with PND and 849,482 unaffected women and individually matched them 10:1 by age and calendar year. Incident PND and PMDs were identified through clinical diagnoses or prescribed medications, and adjustment was made for such demographics as country of birth, educational level, region of residency, and cohabitation status.
In an initial matched-cohort case-control study with a mean follow-up of 6.9 years, PMDs were associated with a nearly five times higher risk of subsequent PND (odds ratio, 4.76; 95% CI, 4.52-5.01; P <.001).
In another matched cohort with a mean follow-up of 7.0 years, there were 4227 newly diagnosed PMDs in women with PND (incidence rate [IR], 7.6/1000 person-years) and 21,326 among controls (IR, 3.8/1000). Compared with matched controls, women with PND were at almost twice the risk of subsequent PMDs (hazard ratio, 1.81; 95% CI, 1.74-1.88; P <.001).
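The incidence rates above can be reproduced from the case counts once the person-year denominators are known. The study reports only the rates, so the person-year totals below are back-calculated approximations (4227 cases at about 7.6/1000 person-years implies roughly 556,000 person-years; 21,326 at about 3.8/1000 implies roughly 5,612,000), used here purely to illustrate the arithmetic.

```python
# Sketch: incidence rate (IR) per 1000 person-years and the crude rate ratio.
# Person-year denominators are approximations back-calculated from the
# reported rates, not figures published in the study.

def ir_per_1000_py(cases, person_years):
    """Incidence rate per 1000 person-years."""
    return cases / person_years * 1000

pnd_cases, pnd_py = 4227, 556_000        # women with PND (approx. person-years)
ctrl_cases, ctrl_py = 21_326, 5_612_000  # matched controls (approx. person-years)

ir_pnd = ir_per_1000_py(pnd_cases, pnd_py)
ir_ctrl = ir_per_1000_py(ctrl_cases, ctrl_py)
print(round(ir_pnd, 1), round(ir_ctrl, 1))  # 7.6 3.8
```

The crude rate ratio (about 7.6/3.8 = 2.0) is close to, but not the same as, the reported hazard ratio of 1.81, which is adjusted for matching and covariates.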
Commenting on the study but not involved in it, Bernard L. Harlow, PhD, a professor of epidemiology at Boston University School of Public Health in Massachusetts who specializes in epidemiologic studies of female reproductive disorders, said he was not surprised at these findings, which clearly support the need for PMD screening in mothers-to-be. “Anything that is easy to measure and noninvasive that will minimize the risk of postpartum depression should be part of the standard of care during the prenatal period.” As to safety: If treatment is indicated, he added, “studies have shown that the risk to the mother and child is much greater if the mother’s mood disorder is not controlled than any risk to the baby due to depression treatment.” But though PMDs may be predictive of PND, there are still barriers to actual PND care. A 2023 analysis reported that 65% of mothers-to-be who screened positive for mental health comorbidities were not referred for treatment.
Dr. Yang and colleagues acknowledged that their findings may not be generalizable to mild forms of these disorders since the data were based on clinical diagnoses and prescriptions.
The study was supported by the Chinese Scholarship Council, the Swedish Research Council for Health, Working Life and Welfare, the Karolinska Institutet, and the Icelandic Research Fund. The authors and Dr. Harlow had no relevant competing interests to disclose.
Premenstrual disorders (PMDs) and perinatal depression (PND) appear to have a bidirectional association, a Swedish national registry-based analysis found.
In women with PND, 2.9% had PMDs before pregnancy vs 0.6% in a matched cohort of unaffected women, according to an international team led by Quian Yang, MD, PhD, of the Institute of Environmental Medicine at the Karolinska Institutet in Stockholm, Sweden. Their study appears in PLoS Medicine.
“Preconception and maternity care providers should be aware of the risk of developing perinatal depression among women with a history of PMDs,” Dr. Yang said in an interview. “Healthcare providers may inform women with perinatal depression about the potential risk of PMDs when menstruation returns after childbirth.” She recommended screening as part of routine perinatal care to identify and treat the condition at an early stage. Counseling and medication may help prevent adverse consequences.
In other findings, the correlation with PMDs held for both prenatal and postnatal depression, regardless of any history of psychiatric disorders and also in full-sister comparisons, the authors noted, with a stronger correlation in the absence of psychiatric disorders (P for interaction <.001).
“Interestingly, we noted a stronger association between PMDs and subsequent PND than the association in the other direction, Dr. Yang said. And although many experience PMD symptom onset in adolescence, symptom worsening has been reported with increasing age and parity. “It is possible that women with milder premenstrual symptoms experienced worse symptoms after pregnancy and are therefore first diagnosed with PMD after pregnancy,” the authors hypothesized.
Both PMDs and PND share depressive symptomatology and onset coinciding with hormonal fluctuations, particularly estrogen and progesterone, suggesting a shared etiology, Dr. Yang explained. “It’s plausible that an abnormal response to natural hormone fluctuations predisposes women to both PMDs and PND. However, the underlying mechanism is complex, and future research is needed to reveal the underlying etiology.”
Affecting a majority of women of reproductive age to some degree, PMDs in certain women can cause significant functional impairment and, when severe, have been linked to increased risks of accidents and suicidal behavior. The psychological symptoms of the more serious form, premenstrual dysphoric disorder, for example, are associated with a 50%-78% lifetime risk for psychiatric disorders, including major depressive, dysthymic, seasonal affective, and generalized anxiety disorders, as well as suicidality.
Mood disorders are common in pregnancy and the postpartum period.
The Swedish Study
In 1.8 million singleton pregnancies in Sweden during 2001-2018, the investigators identified 84,949 women with PND and 849,482 unaffected women and individually matched them 10:1 by age and calendar year. Incident PND and PMDs were identified through clinical diagnoses or prescribed medications, and adjustment was made for such demographics as country of birth, educational level, region of residency, and cohabitation status.
In an initial matched-cohort case-control study with a mean follow-up of 6.9 years, PMDs were associated with a nearly five times higher risk of subsequent PND (odds ratio, 4.76; 95% CI, 4.52-5.01; P <.001).
In another matched cohort with a mean follow-up of 7.0 years, there were 4227 newly diagnosed PMDs in women with PND (incidence rate [IR], 7.6/1000 person-years) and 21,326 among controls (IR, 3.8/1000). Compared with matched controls, women with PND were at almost twice the risk of subsequent PMDs (hazard ratio, 1.81; 95% CI, 1.74-1.88; P <.001).
Commenting on the study but not involved in it, Bernard L. Harlow, PhD, a professor of epidemiology at Boston University School of Public Health in Massachusetts who specializes in epidemiologic studies of female reproductive disorders, said he was not surprised at these findings, which clearly support the need for PMD screening in mothers-to-be. “Anything that is easy to measure and noninvasive that will minimize the risk of postpartum depression should be part of the standard of care during the prenatal period.” As to safety: If treatment is indicated, he added, “studies have shown that the risk to the mother and child is much greater if the mother’s mood disorder is not controlled than any risk to the baby due to depression treatment.” But though PMDs may be predictive of PND, there are still barriers to actual PND care. A 2023 analysis reported that 65% of mothers-to-be who screened positive for mental health comorbidities were not referred for treatment.
Dr. Yang and colleagues acknowledged that their findings may not be generalizable to mild forms of these disorders since the data were based on clinical diagnoses and prescriptions.
The study was supported by the Chinese Scholarship Council, the Swedish Research Council for Health, Working Life and Welfare, the Karolinska Institutet, and the Icelandic Research Fund. The authors and Dr. Harlow had no relevant competing interests to disclose.
FROM PLOS MEDICINE
Myomectomy best for avoiding reintervention after fibroid procedures
Among four uterus-preserving therapeutic approaches to leiomyomata, reintervention rates were lowest after vaginal myomectomy, which was also the most frequently performed procedure, a large cohort study reported.
Accounting for censoring, the 7-year reintervention risk for vaginal myomectomy was 20.6%, followed by uterine artery embolization (26%), endometrial ablation (35.5%), and hysteroscopic myomectomy (37%).
Hysterectomies accounted for 63.2% of reinterventions, according to lead author Susanna D. Mitro, PhD, a research scientist in the Division of Research and Department of Obstetrics and Gynecology at Kaiser Permanente Northern California, Oakland, and colleagues.
Risk did not vary by body mass index, race/ethnicity, or Neighborhood Deprivation Index, but for some procedures it did vary by age and parity.
These findings generally align with earlier research and “illustrate clinically meaningful long-term differences in reintervention rates after a first uterus-preserving treatment for leiomyomas,” the researchers wrote in Obstetrics & Gynecology.
The Study
In a cohort of 10,324 patients ages 18-50, 19.9% were Asian, 21.2% Black, 21.3% Hispanic, and 32.5% White, with 5.2% of other races and ethnicities. The most affected age groups were 41-45 and 46-50 years. All participants underwent a first uterus-preserving procedure after leiomyoma diagnosis according to 2009-2021 electronic health records at Kaiser Permanente Northern California.
Reintervention referred to a second uterus-preserving procedure or hysterectomy. Median follow-up was 3.8 years (interquartile range, 1.8-7.4 years), and the proportions of index procedures were as follows: 18% (1857) for hysteroscopic myomectomy; 16.2% (1669) for uterine artery embolization; 21.4% (2211) for endometrial ablation; and 44.4% (4587) for myomectomy.
Reintervention rates were higher in younger patients after uterine artery embolization, with patients ages 18-35 at the index procedure having 1.4-3.7 times greater reintervention rates than patients ages 46-50 years. Reintervention rates for hysteroscopic myomectomy varied by parity, with multiparous patients at 35% greater risk than their nulliparous counterparts.
On the age issue, the authors note that symptom recurrence may be less common in older patients, perhaps because of the onset of menopause. “Alternatively, findings may be explained by age-specific care strategies: Older patients experiencing symptom recurrence may prefer to wait until the onset of menopause rather than pursuing another surgical treatment,” they wrote.
A recent study with 7 years’ follow-up reported a 2.4 times greater risk of hysterectomy after uterine artery embolization versus myomectomy. Reintervention rates may be lower after myomectomy because otherwise asymptomatic patients pursue myomectomy to treat infertility, the authors wrote. Alternatively, myomectomy may more completely remove leiomyomas.
These common benign tumors take a toll on healthcare resources, with related surgeries, medications, and procedures costing up to $9.4 billion annually (in 2010 dollars) as of 2012. Leiomyomas are reportedly the most frequent reason for hysterectomy.
Robust data on the optimal therapeutic approach to fibroids have been sparse, however, with a 2017 comparative-effectiveness review from the Agency for Healthcare Research and Quality reporting that evidence on leiomyoma treatments was insufficient to guide clinical care. Few well-conducted trials of leiomyoma treatment have directly compared different treatment options, the authors noted.
The rate of myomectomy is reported to be 9.2 per 10,000 woman-years in Black women and 1.3 per 10,000 woman-years in White women, and the recurrence rate after myomectomy can be as great as 60% when patients are followed up to 5 years.
The authors said their findings “may be a reference to discuss expectations for treatment outcomes when choosing initial uterus-preserving treatment for leiomyomas, especially for patients receiving treatment years before the likely onset of menopause.”
This research was supported by the National Institute of Arthritis and Musculoskeletal and Skin Diseases of the National Institutes of Health. Coauthor Dr. Lauren Wise is a paid consultant for AbbVie and has received in-kind donations from Swiss Precision Diagnostics and Kindara.com; she has also received payment from the Gates Foundation.
FROM OBSTETRICS & GYNECOLOGY
New CRC stool test beats FIT for sensitivity but not specificity
A next-generation multitarget stool DNA test outperformed the fecal immunochemical test (FIT) in sensitivity for colorectal cancer (CRC), though not in specificity, according to the large prospective BLUE-C study.
The multi-target assay by Exact Sciences Corporation, the makers of Cologuard, includes new biomarkers designed to increase specificity without decreasing sensitivity. It showed a sensitivity for CRC of almost 94%, with more than 43% sensitivity for advanced precancerous lesions and nearly 91% specificity for advanced neoplasia, according to the study results, which were published in The New England Journal of Medicine.
Adherence to CRC screening in the United States is well below the 80% national target, and the quest continues for noninvasive screening assays that might improve screening adherence, noted lead author Thomas F. Imperiale, MD, AGAF, a professor of medicine at Indiana University School of Medicine in Indianapolis, and colleagues.
“The test’s manufacturer developed a new version of its existing Cologuard FIT/DNA test because it took to heart the feedback from primary care providers and gastroenterologists about the test’s low specificity,” Dr. Imperiale said in an interview. “The goal of the new test was to improve specificity without losing, and perhaps even gaining, some sensitivity — a goal that is not easily accomplished when you’re trying to improve on a sensitivity for colorectal cancer that was already 92.3% in the current version of Cologuard.”
Compared with the earlier version of Cologuard, he added, the new generation retained sensitivity for CRC and advanced precancerous lesions or polyps while improving specificity for advanced neoplasia — a combination of CRC and advanced precancerous lesions — from 86.6% to 90.6%, a roughly 30% reduction in the false-positive rate. “This with the caveat, however, that the two versions were not compared head-to-head in this new study,” Dr. Imperiale said.
The higher specificity for advanced lesions is expected to translate to a lower false positive rate. Lowering false positive rates is crucial because that reduces the need for costly, invasive, and unnecessary colonoscopies, said Aasma Shaukat, MD, MPH, AGAF, director of outcomes research in NYU Langone Health’s division of gastroenterology and hepatology in New York City.
“Many physicians felt there were too many false positives with the existing version, and that is anxiety-provoking in patients and providers,” said Dr. Shaukat, who was not involved in the study.
In her view, however, the test’s moderate improvements in detecting certain lesions do not make it demonstrably superior to its predecessor, and there is always the possibility of higher cost to consider.
While acknowledging that a higher sensitivity for all advanced precancerous lesions would have been welcome, Dr. Imperiale said the test detected 75% of the most worrisome of such lesions — “the ones containing high-grade dysplastic cells and suggesting near-term conversion to cancer. And its ability to detect other advanced lesions improved as the size of the lesions increased.”
Testing details
Almost 21,000 asymptomatic participants age 40 years and older undergoing screening colonoscopy were evaluated at 186 US sites during the period 2019 to 2023. Of the cohort, 98 had CRC, 2144 had advanced precancerous lesions, 6973 had nonadvanced adenomas, and 10,961 had nonneoplastic findings or negative colonoscopy.
Advanced precancerous lesions included one or more adenomas or sessile serrated lesions measuring at least 1 cm in the longest dimension, lesions with villous histologic features, and high-grade dysplasia. The new DNA test identified 92 of 98 participants with CRC and 76 of 82 participants with screening-relevant cancers. Among the findings for the new assay:
- Sensitivity for any-stage CRC was 93.9% (95% confidence interval [CI], 87.1-97.7)
- Sensitivity for advanced precancerous lesions was 43.4% (95% CI, 41.3-45.6)
- Sensitivity for high-grade dysplasia was 74.6% (95% CI, 65.6-82.3)
- Specificity for advanced neoplasia was 90.6% (95% CI, 90.1-91.0)
- Specificity for nonneoplastic findings or negative colonoscopy was 92.7% (95% CI, 92.2-93.1)
- Specificity for negative colonoscopy was 93.3% (95% CI, 92.8-93.9)
- No adverse events occurred.
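The headline sensitivities above follow directly from the detection counts reported in the study (92 of 98 CRCs, 76 of 82 screening-relevant cancers); a minimal sketch confirms the arithmetic. The helper name is illustrative, not from the study, and the confidence intervals in the bullets come from the paper itself, not from this calculation.

```python
def sensitivity(true_positives: int, total_with_disease: int) -> float:
    """Fraction of participants with disease whom the test flags positive."""
    return true_positives / total_with_disease

# 92 of 98 participants with CRC tested positive (reported: 93.9%)
crc_sens = sensitivity(92, 98)
print(f"Any-stage CRC sensitivity: {crc_sens:.1%}")

# 76 of 82 screening-relevant cancers were detected (reported: 92.7%)
screen_sens = sensitivity(76, 82)
print(f"Screening-relevant cancer sensitivity: {screen_sens:.1%}")
```

Specificity works the same way on the disease-free side of the cohort: true negatives divided by all participants without the target lesion, which is why a higher specificity directly lowers the false-positive (unnecessary colonoscopy) rate.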
In the comparator assay, OC-AUTO FIT by Polymedco, sensitivity was 67.3% (95% CI, 57.1-76.5) for CRC, 23.3% (95% CI, 21.5-25.2) for advanced precancerous lesions, and 47.4% (95% CI, 37.9-56.9) for high-grade dysplasia. In the comparator FIT, however, specificity was better across all age groups — at 94.8% (95% CI, 94.4-95.1) for advanced neoplasia, 95.7% (95% CI, 95.3-96.1) for nonneoplastic findings, and 96.0% (95% CI, 95.5-96.4) for negative colonoscopy.
In another article in the same issue of NEJM, Guardant Health’s cell-free DNA blood-based test had 83% sensitivity for CRC, 90% specificity for advanced neoplasia, and 13% sensitivity for advanced precancerous lesions in an average-risk population.
An age-related decrease in specificity was observed with the new Cologuard test, but that did not concern Dr. Imperiale because the same observation was made with the current version. “In fact, the next-gen version appears to have less of an age-related decrease in specificity than the current version, although, again, the two versions were not tested head-to-head,” he noted.
The effect of age-related background methylation of DNA is well known, he explained. “Clinicians and older patients in the screening age range do need to be aware of this effect on specificity before ordering or agreeing to do the test. I do not see this as a stumbling block to implementation, but it does require discussion between patient and ordering provider.”
The new version of the DNA test is expected to be available in about a year.
According to Dr. Imperiale, further research is needed to ascertain the test’s acceptability and adherence rates and to quantify its yield in population-based screening. Determining its cost-effectiveness and making it easier to use are other goals. “And most importantly, the degree of reduction in the incidence and mortality from colorectal cancer,” he said.
Cost-effectiveness and the selection of the testing interval may play roles in adherence, particularly in populations with lower rates of screening adherence than the general population, John M. Carethers, MD, AGAF, of the University of California, San Diego, noted in a related editorial.
“Adherence to screening varies according to age group, including persons in the 45- to 49-year age group who are now eligible for average-risk screening,” he wrote. “It is hoped that these newer tests will increase use and adherence and elevate the percentage of the population undergoing screening in order to reduce deaths from colorectal cancer.”
This study was sponsored by Exact Sciences Corporation, which conducted the stool testing at its laboratories.
Dr. Imperiale had no competing interests to disclose. Several study co-authors reported employment with Exact Sciences, or stock and intellectual property ownership. Dr. Shaukat disclosed consulting for Freenome. Dr. Carethers reported ties to Avantor Inc. and Geneoscopy.
, according to the large prospective BLUE-C study.
The multi-target assay by Exact Sciences Corporation, the makers of Cologuard, includes new biomarkers designed to increase specificity without decreasing sensitivity. It showed a sensitivity for CRC of almost 94%, with more than 43% sensitivity for advanced precancerous lesions and nearly 91% specificity for advanced neoplasia, according to the study results, which were published in The New England Journal of Medicine.
Adherence to CRC screening in the United States is well below the 80% national target, and the quest continues for noninvasive screening assays that might improve screening adherence, noted lead author Thomas F. Imperiale, MD, AGAF, a professor of medicine at Indiana University School of medicine in Indianapolis, and colleagues.
“The test’s manufacturer developed a new version of its existing Cologuard FIT/DNA test because it took to heart the feedback from primary care providers and gastroenterologists about the test’s low specificity,” Dr. Imperiale said in an interview. “The goal of the new test was to improve specificity without losing, and perhaps even gaining, some sensitivity — a goal that is not easily accomplished when you’re trying to improve on a sensitivity for colorectal cancer that was already 92.3% in the current version of Cologuard.”
Compared with the earlier version of Cologuard, he added, the new generation retained sensitivity for CRC and advanced precancerous lesions or polyps while improving specificity by 30% (90.6% vs 86.6%) for advanced neoplasia — a combination of CRC and advanced precancerous lesions, he said. “This with the caveat, however, that the two versions were not compared head-to-head in this new study,” Dr. Imperiale said.
The higher specificity for advanced lesions is expected to translate to a lower false positive rate. Lowering false positive rates is crucial because that reduces the need for costly, invasive, and unnecessary colonoscopies, said Aasma Shaukat, MD, MPH, AGAF, director of outcomes research in NYU Langone Health’s division of gastroenterology and hepatology in New York City.
“Many physicians felt there were too many false positives with the existing version, and that is anxiety-provoking in patients and providers,” said Dr. Shaukat, who was not involved in the study.
In her view, however, the test’s moderate improvements in detecting certain lesions does not make it demonstrably superior to its predecessor, and there is always the possibility of higher cost to consider.
While acknowledging that a higher sensitivity for all advanced precancerous lesions would have been welcome, Dr. Imperiale said the test detected 75% of the most worrisome of such lesions — “the ones containing high-grade dysplastic cells and suggesting near-term conversion to cancer. And its ability to detect other advanced lesions improved as the size of the lesions increased.”
Testing details
Almost 21,000 asymptomatic participants age 40 years and older undergoing screening colonoscopy were evaluated at 186 US sites during the period 2019 to 2023. Of the cohort, 98 had CRC, 2144 had advanced precancerous lesions, 6973 had nonadvanced adenomas, and 10,961 had nonneoplastic findings or negative colonoscopy.
Advanced precancerous lesions included one or more adenomas or sessile serrated lesions measuring at least 1 cm in the longest dimension, lesions with villous histologic features, and high-grade dysplasia. The new DNA test identified 92 of 98 participants with CRC and 76 of 82 participants with screening-relevant cancers. Among the findings for the new assay:
- Sensitivity for any-stage CRC was 93.9% (95% confidence interval [CI], 87.1- 97.7)
- Sensitivity for advanced precancerous lesions was 43.4% (95% CI, 41.3-45.6)
- Sensitivity for high-grade dysplasia was 74.6% (95% CI, 65.6-82.3)
- Specificity for advanced neoplasia was 90.6% (95% CI, 90.1- 91.0).
- Specificity for nonneoplastic findings or negative colonoscopy was 92.7% (95% CI, 92.2-93.1)
- Specificity for negative colonoscopy was 93.3 (95% CI, 92.8-93.9)
- No adverse events occurred.
In the comparator assay, OC-AUTO FIT by Polymedco, sensitivity was 67.3% (95% CI, 57.1-76.5) for CRC, 23.3% (95% CI, 21.5-25.2) for advanced precancerous lesions, and 47.4% (95% CI, 37.9-56.9) for high-grade dysplasia. In the comparator FIT, however, specificity was better across all age groups — at 94.8% (95% CI, 94.4-95.1) for advanced neoplasia, 95.7% (95% CI, 95.3- 96.1) for nonneoplastic findings, and 96.0% (95% CI, 95.5-96.4) for negative colonoscopy.
In another article in the same issue of NEJM, Guardant Health’s cell-free DNA blood-based test had 83% sensitivity for CRC, 90% specificity for advanced neoplasia, and 13% sensitivity for advanced precancerous lesions in an average-risk population.
An age-related decrease in specificity was observed with the new Cologuard test, but that did not concern Dr. Imperiale because the same observation was made with the current version. “In fact, the next-gen version appears to have less of an age-related decrease in specificity than the current version, although, again, the two versions were not tested head-to-head,” he noted.
The effect of age-related background methylation of DNA is well known, he explained. “Clinicians and older patients in the screening age range do need to be aware of this effect on specificity before ordering or agreeing to do the test. I do not see this as a stumbling block to implementation, but it does require discussion between patient and ordering provider.”
The new version of the DNA test is expected to be available in about a year.
According to Dr. Imperiale, further research is needed to ascertain the test’s acceptability and adherence rates and to quantify its yield in population-based screening. Determining its cost-effectiveness and making it easier to use are other goals. “And most importantly, the degree of reduction in the incidence and mortality from colorectal cancer,” he said.
Cost-effectiveness and the selection of the testing interval may play roles in adherence, particularly in populations with lower rates of screening adherence than the general population, John M. Carethers, MD, AGAF, of the University of California, San Diego, noted in a related editorial.
“Adherence to screening varies according to age group, including persons in the 45- to 49-year age group who are now eligible for average-risk screening,” he wrote. “It is hoped that these newer tests will increase use and adherence and elevate the percentage of the population undergoing screening in order to reduce deaths from colorectal cancer.”
This study was sponsored by Exact Sciences Corporation, which conducted the stool testing at its laboratories.
Dr. Imperiale had no competing interests to disclose. Several study co-authors reported employment with Exact Sciences, or stock and intellectual property ownership. Dr. Shaukat disclosed consulting for Freenome. Dr. Carethers reported ties to Avantor Inc. and Geneoscopy.
, according to the large prospective BLUE-C study.
The multi-target assay by Exact Sciences Corporation, the makers of Cologuard, includes new biomarkers designed to increase specificity without decreasing sensitivity. It showed a sensitivity for CRC of almost 94%, with more than 43% sensitivity for advanced precancerous lesions and nearly 91% specificity for advanced neoplasia, according to the study results, which were published in The New England Journal of Medicine.
Adherence to CRC screening in the United States is well below the 80% national target, and the quest continues for noninvasive screening assays that might improve screening adherence, noted lead author Thomas F. Imperiale, MD, AGAF, a professor of medicine at Indiana University School of medicine in Indianapolis, and colleagues.
“The test’s manufacturer developed a new version of its existing Cologuard FIT/DNA test because it took to heart the feedback from primary care providers and gastroenterologists about the test’s low specificity,” Dr. Imperiale said in an interview. “The goal of the new test was to improve specificity without losing, and perhaps even gaining, some sensitivity — a goal that is not easily accomplished when you’re trying to improve on a sensitivity for colorectal cancer that was already 92.3% in the current version of Cologuard.”
Compared with the earlier version of Cologuard, he added, the new generation retained sensitivity for CRC and advanced precancerous lesions or polyps while improving specificity by 30% (90.6% vs 86.6%) for advanced neoplasia — a combination of CRC and advanced precancerous lesions, he said. “This with the caveat, however, that the two versions were not compared head-to-head in this new study,” Dr. Imperiale said.
The higher specificity for advanced lesions is expected to translate to a lower false positive rate. Lowering false positive rates is crucial because that reduces the need for costly, invasive, and unnecessary colonoscopies, said Aasma Shaukat, MD, MPH, AGAF, director of outcomes research in NYU Langone Health’s division of gastroenterology and hepatology in New York City.
“Many physicians felt there were too many false positives with the existing version, and that is anxiety-provoking in patients and providers,” said Dr. Shaukat, who was not involved in the study.
In her view, however, the test’s moderate improvements in detecting certain lesions does not make it demonstrably superior to its predecessor, and there is always the possibility of higher cost to consider.
While acknowledging that a higher sensitivity for all advanced precancerous lesions would have been welcome, Dr. Imperiale said the test detected 75% of the most worrisome of such lesions — “the ones containing high-grade dysplastic cells and suggesting near-term conversion to cancer. And its ability to detect other advanced lesions improved as the size of the lesions increased.”
Testing details
Almost 21,000 asymptomatic participants aged 40 years or older undergoing screening colonoscopy were evaluated at 186 US sites from 2019 to 2023. Of the cohort, 98 had CRC, 2144 had advanced precancerous lesions, 6973 had nonadvanced adenomas, and 10,961 had nonneoplastic findings or negative colonoscopy.
Advanced precancerous lesions included one or more adenomas or sessile serrated lesions measuring at least 1 cm in the longest dimension, lesions with villous histologic features, and high-grade dysplasia. The new DNA test identified 92 of 98 participants with CRC and 76 of 82 participants with screening-relevant cancers. Among the findings for the new assay:
- Sensitivity for any-stage CRC was 93.9% (95% confidence interval [CI], 87.1-97.7)
- Sensitivity for advanced precancerous lesions was 43.4% (95% CI, 41.3-45.6)
- Sensitivity for high-grade dysplasia was 74.6% (95% CI, 65.6-82.3)
- Specificity for advanced neoplasia was 90.6% (95% CI, 90.1-91.0)
- Specificity for nonneoplastic findings or negative colonoscopy was 92.7% (95% CI, 92.2-93.1)
- Specificity for negative colonoscopy was 93.3% (95% CI, 92.8-93.9)
- No adverse events occurred.
In the comparator assay, OC-AUTO FIT by Polymedco, sensitivity was 67.3% (95% CI, 57.1-76.5) for CRC, 23.3% (95% CI, 21.5-25.2) for advanced precancerous lesions, and 47.4% (95% CI, 37.9-56.9) for high-grade dysplasia. In the comparator FIT, however, specificity was better across all age groups — at 94.8% (95% CI, 94.4-95.1) for advanced neoplasia, 95.7% (95% CI, 95.3-96.1) for nonneoplastic findings, and 96.0% (95% CI, 95.5-96.4) for negative colonoscopy.
In another article in the same issue of NEJM, Guardant Health’s cell-free DNA blood-based test had 83% sensitivity for CRC, 90% specificity for advanced neoplasia, and 13% sensitivity for advanced precancerous lesions in an average-risk population.
An age-related decrease in specificity was observed with the new Cologuard test, but that did not concern Dr. Imperiale because the same observation was made with the current version. “In fact, the next-gen version appears to have less of an age-related decrease in specificity than the current version, although, again, the two versions were not tested head-to-head,” he noted.
The effect of age-related background methylation of DNA is well known, he explained. “Clinicians and older patients in the screening age range do need to be aware of this effect on specificity before ordering or agreeing to do the test. I do not see this as a stumbling block to implementation, but it does require discussion between patient and ordering provider.”
The new version of the DNA test is expected to be available in about a year.
According to Dr. Imperiale, further research is needed to ascertain the test’s acceptability and adherence rates and to quantify its yield in population-based screening. Determining its cost-effectiveness and making it easier to use are other goals. “And most importantly, the degree of reduction in the incidence and mortality from colorectal cancer,” he said.
Cost-effectiveness and the selection of the testing interval may play roles in adherence, particularly in populations with lower rates of screening adherence than the general population, John M. Carethers, MD, AGAF, of the University of California, San Diego, noted in a related editorial.
“Adherence to screening varies according to age group, including persons in the 45- to 49-year age group who are now eligible for average-risk screening,” he wrote. “It is hoped that these newer tests will increase use and adherence and elevate the percentage of the population undergoing screening in order to reduce deaths from colorectal cancer.”
This study was sponsored by Exact Sciences Corporation, which conducted the stool testing at its laboratories.
Dr. Imperiale had no competing interests to disclose. Several study co-authors reported employment with Exact Sciences, or stock and intellectual property ownership. Dr. Shaukat disclosed consulting for Freenome. Dr. Carethers reported ties to Avantor Inc. and Geneoscopy.
FROM NEW ENGLAND JOURNAL OF MEDICINE
USPSTF: Insufficient Evidence for Primary Care Interventions to Prevent Child Maltreatment
While primary care physicians are uniquely positioned to identify mistreated minors, there is insufficient evidence on the balance of benefits and harms to support primary care interventions to prevent maltreatment in children who have no indicative signs or symptoms. That is the conclusion of the US Preventive Services Task Force (USPSTF) in an update of its 2018 statement published in JAMA Network Open.
This gap, however, might be partially filled by addressing known social determinants of health in young patients, such as economic stability, food, shelter, and healthcare access. The USPSTF statement is based on a simultaneously published evidence review and synthesis compiled by Meera Viswanathan, PhD, of the RTI International-University of North Carolina at Chapel Hill Evidence-Based Practice Center in Research Triangle Park, NC, and colleagues.
The review included 14,355 participants in 25 trials, of which 23 included home visits. It measured such things as direct reports to Child Protective Services or removal of children from the home and proxy measures of abuse or neglect such as injury, emergency department visits, and hospitalizations. In addition, it looked at behavioral, developmental, emotional, mental or physical health and well-being, mortality, and harms.
More than 50% of the studies analyzed enrolled children with no prior reports of maltreatment. In addition to limited and inconsistent findings, the researchers noted wide variance in screening, identifying, and reporting child maltreatment to authorities, including variations by race or ethnicity, as well as wide variance in the accuracy of screening instruments.
“Contextual evidence pointed to the potential for bias or inaccuracy in screening, identification, and reporting of child maltreatment but also highlighted the importance of addressing social determinants when intervening to prevent child maltreatment,” Dr. Viswanathan’s group wrote.
The USPSTF panel, chaired by Michael J. Barry, MD, of Harvard Medical School, Boston, Massachusetts (now immediate past chair of the Task Force), stressed that the current statement applies only to children with no signs of maltreatment: Those with direct signs should be assessed and appropriately reported.
A Common and Costly Problem
Child abuse or neglect is widespread and has long-lasting adverse effects. In 2021, the statement noted, Child Protective Services identified 600,000 children as abused or neglected, with 1820 related deaths. Most (76%) experienced neglect, but many were subjected to physical abuse (16%), sexual abuse (10%), and sex trafficking (0.2%). Of the 1820 who died, 78% experienced neglect and 43% experienced physical abuse alone or combined with maltreatment such as neglect and psychological abuse.
Benefits aside, among the potential harms of intervention, the USPSTF noted, are family stigma and bias toward non-White and low-income groups. There may be a greater probability of clinicians' disproportionately reporting abuse for the children of Black, Hispanic, Indigenous, and one-parent households. Some studies indicate that more cases of maltreatment are missed in White children, the review authors noted.
“Additional evidence is needed to clarify potential linkages between improvements in social determinants of health and child maltreatment prevention,” the USPSTF panelists concluded. They acknowledged that their recommendation does not address the effectiveness of interventions such as home visits to improve family well-being.
In an accompanying editorial, Samantha Schilling, MD, MSHP, of the Department of Pediatrics at the University of North Carolina at Chapel Hill, and colleagues from the Children's Hospital of Philadelphia in Pennsylvania admitted they were "disheartened, but not surprised" at the USPSTF's conclusions and urged that prevention measures be continued. "It is not yet time to wave the white flag of surrender and abandon primary care–based efforts to mitigate risks for child abuse and neglect."
They sent a heartfelt message to primary care doctors: “Know this: while additional evidence is amassed, do not stop your ongoing efforts to protect vulnerable children. You are an important component of child maltreatment prevention, although your actions and support cannot be delivered (or measured) in isolation.”
Dr. Schilling and associates argued that insufficient evidence does not mean that primary care prevention efforts are ineffective, only that evidence is lacking. They pointed out that proximal outcomes along a causal pathway have been used to assess the effectiveness of preventive measures and should be considered in this context. “For example, based on evidence that counseling about minimizing exposure to UV radiation is associated with a moderate increase in use of sunscreen protection, the USPSTF recommends that counseling be provided to certain populations,” they wrote. “The USPSTF did not require direct evidence that counseling decreases skin cancer.”
More high-quality research is needed, as the USPSTF recognized. “Given the inadequacy of the current gold standard measures of child maltreatment, proximal outcomes on the complex, multifactorial, causal pathway to child abuse and neglect should be considered,” the commentators wrote.
The commentators also acknowledged that patients’ caregivers often struggle to do their best with sparse resources and that resources such as food and housing, treatment for substance use and mental health disorders, appropriate strategies to manage typical child behavior, and affordable child care too often fall short.
They argued, therefore, that consequential prevention is not possible without sustained investment in policies and programs that provide tangible support to families, reduce childhood poverty, and target relevant risk factors.
The Agency for Healthcare Research and Quality of the US Department of Health and Human Services supports the operations of the USPSTF. Dr. Barry reported grants from Healthwise, a nonprofit organization, outside of the submitted work. Dr. Silverstein reported receiving a research grant on approaches to child maltreatment prevention. Dr. Lee reported grants from the National Institute on Aging. The evidence review was supported by a grant from the Agency for Healthcare Research and Quality. Dr. Viswanathan and colleagues disclosed no conflicts of interest. Dr. Wood reported grants from the Annie E. Casey Foundation outside of the submitted work. Dr. Christian reported personal fees from multiple government agencies and legal firms and provides medical-legal expert work in child abuse cases outside of the submitted work.
FROM JAMA NETWORK OPEN