Paclitaxel Matches Cisplatin HIPEC in Ovarian Cancer

TOPLINE:

Patients with advanced ovarian cancer undergoing interval cytoreductive surgery who received paclitaxel-based hyperthermic intraperitoneal chemotherapy (HIPEC) during surgery appeared to have comparable overall survival and disease-free survival rates to those who received cisplatin-based HIPEC.

METHODOLOGY:

  • Although the use of HIPEC remains controversial, cisplatin-based HIPEC during cytoreductive surgery may benefit patients with advanced ovarian cancer; however, there is less evidence for paclitaxel-based HIPEC, typically used in patients who are frail or intolerant to platinum agents.
  • To compare the two regimens, researchers analyzed data from the National Registry of Peritoneal Carcinomatosis, which included 846 patients (mean age, 59 years) who underwent interval cytoreductive surgery with either cisplatin-based HIPEC (n = 325) or paclitaxel-based HIPEC (n = 521). After propensity score matching, there were 199 patients per group (total = 398).
  • HIPEC was administered post-surgery with cisplatin (75-100 mg/m² for 90 minutes) or paclitaxel (120 mg/m² for 60 minutes), both at 42-43 °C.

TAKEAWAY:

  • Using cisplatin as the reference group, the median overall survival was not significantly different between the two options (hazard ratio [HR], 0.74; P = .16); however, the median overall survival was 82 months in the paclitaxel group vs 58 months in the cisplatin group.
  • Disease-free survival was also not significantly different between the two groups, with a median of 20 months in the cisplatin group and 21 months in the paclitaxel group (HR, 0.95; 95% CI, 0.72-1.25; P = .70).
  • Overall survival was comparable during the first 20 months of follow-up and disease-free survival was equivalent during the first 15 months of follow-up, based on a predefined equivalence margin of 0.1.
  • Paclitaxel-based HIPEC was not associated with increased morbidity (odds ratio, 1.32; P = .06).

IN PRACTICE:

“Our study suggests that cisplatin and paclitaxel are two safe and effective drugs to be used for HIPEC in [interval cytoreductive surgery] for advanced ovarian cancer. As cisplatin is the preferred drug according to strong evidence, paclitaxel could be a valuable alternative for patients with any contraindication to cisplatin, with similar oncological and perioperative outcomes,” the authors wrote.

SOURCE:

This study, led by Salud González Sánchez, MD, of Reina Sofía University Hospital in Córdoba, Spain, was published online in JAMA Network Open.

LIMITATIONS:

The retrospective design of this study limited causal inference. The BRCA mutation status was not captured in the national registry. Additionally, the matching procedure resulted in a moderate sample size, which could have led to residual confounding.

DISCLOSURES:

The authors did not declare any funding information and reported no relevant conflicts of interest.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication.

A version of this article first appeared on Medscape.com.

VA To Lose 30K Positions Via Attrition, No RIFs Planned

The initial plan to reduce the US Department of Veterans Affairs (VA) workforce by 15%—roughly 83,000 employees—has been revised. The VA announced that it expected to reduce its workforce by 30,000 positions through normal attrition, early retirements, and resignations by the end of fiscal year 2025, “eliminating the need for a large-scale reduction-in-force.” Most of the positions will not be replaced due to the federal hiring freeze, which has been extended for 3 months.

“Since March, we’ve been conducting a holistic review of the department centered on reducing bureaucracy and improving services to Veterans,” VA Secretary Doug Collins said in a press release. “A department-wide RIF is off the table, but that doesn’t mean we’re done improving VA.”

About 17,000 VA employees have left their jobs as of June 1. Between now and Sept. 30, the department expects a further reduction of nearly 12,000. Pete Kasperowicz, a VA spokesperson, said there would not be any reductions beyond the 30,000 planned.

The VA says it has multiple safeguards in place to ensure the reductions do not impact veteran care or benefits. All VA mission-critical positions are exempt from the voluntary early retirement authority and deferred resignation program, and more than 350,000 positions are exempt from the federal hiring freeze.

The release noted several other improvements regarding VA performance in 2025, among them that the disability claims backlog has been reduced by 30% and a record 2 million disability claims have been processed by June. More than 60,000 VA employees have also returned to the office, according to the release.

“As a result of our efforts, VA is headed in the right direction – both in terms of staff levels and customer service,” Collins said. “Our review has resulted in a host of new ideas for better serving Veterans that we will continue to pursue.” 

Dementia Risk May Follow a Geographic Pattern

TOPLINE:

Dementia incidence varied significantly by US region in a new study, with the Southeast showing a 25% higher risk and the Northwest and Rocky Mountains each showing a 23% higher risk compared with the Mid-Atlantic. Investigators said the findings highlight the need for a geographically tailored approach to address dementia risk factors and diagnostic services.

METHODOLOGY:

  • Researchers conducted a cohort study using data from the US Veterans Health Administration for more than 1.2 million older adults without dementia (mean age, 73.9 years; 98% men) from 1999 to 2021. The average follow-up was 12.6 years.
  • Ten geographical regions across the US were defined using the CDC National Center for Chronic Disease Prevention and Health Promotion definition.
  • The diagnosis of dementia was made using International Classification of Diseases, Ninth and Tenth Revision codes from inpatient and outpatient visits.

TAKEAWAY:

  • Dementia incidence rates per 1000 person-years were lowest in the Mid-Atlantic (11.2; 95% CI, 11.1-11.4) and highest in the Southeast (14.0; 95% CI, 13.8-14.2).
  • After adjusting for demographics, compared with the Mid-Atlantic region, dementia incidence was highest in the Southeast (rate ratio [RR], 1.25), followed by the Northwest and Rocky Mountains (RR for both, 1.23), South (RR, 1.18), Southwest (RR, 1.13), and Midwest and South Atlantic (RR for both, 1.12). The Great Lakes and Northeast regions had less than a 10% difference in incidence.
  • Results remained consistent after adjusting for rurality and cardiovascular comorbidities, and after accounting for competing risk for death.

IN PRACTICE:

“This study provides valuable insights into the regional variation in dementia incidence among US veterans in that we observed more than 20% greater incidence in several regions compared with the Mid-Atlantic region,” the investigators wrote. 

“By identifying areas with the highest incidence rates, resources can be better allocated and targeted interventions designed to mitigate the impact of dementia on vulnerable populations,” they added.

SOURCE:

This study was led by Christina S. Dintica, PhD, University of California, San Francisco. It was published online on June 9 in JAMA Neurology.

LIMITATIONS:

This study population was limited to US veterans, limiting the generalizability of the findings. Education level was defined using educational attainment rates in the participants’ zip codes rather than individual data. Additionally, because residential history was limited to a single location per participant, migration patterns could not be tracked. 

DISCLOSURES:

This study was supported by grants from the Alzheimer’s Association, the National Institute on Aging, and the Department of Defense. One author reported serving on data and safety monitoring boards for studies sponsored by the National Institutes of Health, as well as holding advisory board membership and receiving personal fees from industry. Full details are listed in the original article. The other four investigators reported no relevant financial conflicts.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication.

A version of this article first appeared on Medscape.com.

Sclerosing Mesenteritis: What GIs Need to Know About This Rare Disease

AGA has issued an updated pragmatic review on sclerosing mesenteritis (SM). Published in Clinical Gastroenterology and Hepatology, the update evaluates available evidence for diagnosis and treatment and examines opportunities for future research in SM, previously known by such names as misty mesentery, mesenteric panniculitis, and inflammatory pseudotumor.

Led by Mark T. Worthington, MD, AGAF, a professor of medicine in the Division of Gastroenterology and Hepatology at the University of Virginia in Charlottesville, Virginia, an expert AGA panel described SM as an uncommon benign idiopathic autoimmune disease of the mesenteric fat. Although its etiology is poorly understood, gastroenterologists need to be prepared to diagnose it.

“CT radiologists increasingly are reporting SM and related lesions, such as misty mesentery,” Worthington told GI & Hepatology News. “We are also seeing new SM cases caused by immune checkpoint inhibitors in cancer treatment, and the oncologists ask us to manage this because it interferes with the treatment of the underlying malignancy. Those are often readily treated because we catch them so early.” Metabolic syndrome and associated conditions increase the risk for SM, as does aging.

The recent changes are intended to help clinicians predict disease activity and the need for other testing or treatment. “For instance, most cases are indolent and do not require aggressive treatment — often no treatment at all — but for those that are aggressive, we want the clinician to be able to identify those and make sure the treatment is appropriate. The aggressive cases may warrant tertiary referral,” Worthington said. “A secondary cancer is a possibility in this condition, so drawing from the SM radiology studies, we try to help the clinician decide who needs other testing, such as PET-CT or biopsy, and who can be monitored.”

As many as 60% of cases are asymptomatic, requiring no treatment. Abdominal pain is the most frequent symptom and its location on clinical examination should correspond to the SM lesion on imaging. Treatment involves anti-inflammatory medications tailored to disease severity and clinical response.

Biopsy is not necessary if the lesion meets three of the five CT criteria reported by B. Coulier and has no features of more aggressive disease or malignancy. Although some have suggested that SM may be a paraneoplastic syndrome, current evidence does not support this. SM needs to be differentiated from other diagnoses such as non-Hodgkin’s lymphoma, peritoneal carcinomatosis, and mesenteric fibromatosis.

“There are now CT guidelines for who actually has SM, who needs a biopsy or a PET-CT to rule out malignancy, and who doesn’t,” said Worthington. “Radiologists do not always use the Coulier criteria for diagnosis, but often they will with encouragement. From this review, a GI clinician should be able to identify SM on CT.”

Epidemiologically, retrospective CT studies have reported a frequency of 0.6%-1.1%, the panelists noted. And while demographic data are limited, a large early case series reported that patients with SM had a mean age of 55 years and were more likely to be men and of White race.

Patients with SM do not have a higher prevalence of autoimmunity in general, but may have increased rates of metabolic syndrome, obesity, coronary artery disease, and urolithiasis, the panelists noted.

The update allows room for differences in clinical judgment. “For instance, a longer or more frequent CT surveillance interval can be justified depending on the patient’s findings, and no one should feel locked in by these recommendations,” Worthington said.

Medical Therapy

Although there is no surgical cure, pharmacologic options are many. These include prednisone, tamoxifen, colchicine, azathioprine, thalidomide, cyclophosphamide, and methotrexate, as well as the biologics rituximab, infliximab and ustekinumab. Current corticosteroid-based therapies often require months to achieve a clinical response, however.

Bowel obstruction is managed nonoperatively when feasible, but medically refractory disease may require surgical bypass.

Offering his perspective on the guidance but not involved in its formulation, gastroenterologist Stephen B. Hanauer, MD, AGAF, a professor of medicine at Northwestern Medicine in Chicago, said, “The most useful component of the practical review is the algorithm for diagnosis and determination when biopsy or follow-up imaging is reasonable in the absence of evidence.” He stressed that the recommendations are pragmatic rather than evidence-based “as there are no controlled trials and the presentation is heterogeneous.”

Hanauer added that none of the recommended treatments have been shown to impact reduction on imaging. “Hence, all of the treatments are empiric without biological or imaging endpoints.”

In his experience, patients with inflammatory features are the best candidates for immune-directed therapies as reduction in inflammatory markers is a potential endpoint, although no therapies have demonstrated an effect on imaging or progression. “As an IBD doctor, I favor steroids and azathioprine or anti-TNF directed therapy, but again, there is no evidence beyond reports of symptomatic improvement.” 

Worthington and colleagues agreed that treatment protocols have developed empirically. “Future investigation for symptomatic SM should focus on the nature of the inflammatory response, including causative cytokines and other proinflammatory mediators, the goal being targeted therapy with fewer side effects and a more rapid clinical response,” they wrote.

Currently, said Worthington, the biggest gaps remain in treatment. “Even the best studies are small and anecdotal, and we do not know the cytokine or other proinflammatory mediators.”

This guidance was supported by the AGA. Worthington reported renumeration from TriCity Surgery Center, Prescott, Ariz. Hanauer had no conflicts of interest relevant to their comments.

A version of this article appeared on Medscape.com.

Publications
Topics
Sections

AGA has issued an updated pragmatic review on sclerosing mesenteritis (SM). Published in Clinical Gastroenterology and Hepatology, the update evaluates available evidence for diagnosis and treatment and examines opportunities for future research in SM, previously known by such names as misty mesentery, mesenteric panniculitis, and inflammatory pseudotumor.

Led by Mark T. Worthington, MD, AGAF, a professor of medicine in the Division of Gastroenterology and Hepatology at the University of Virginia in Charlottesville, Virginia, an expert AGA panel described SM as an uncommon benign idiopathic autoimmune disease of the mesenteric fat. Although its etiology is poorly understood, gastroenterologists need to be prepared to diagnose it.

“CT radiologists increasingly are reporting SM and related lesions, such as misty mesentery,” Worthington told GI & Hepatology News. “We are also seeing new SM cases caused by immune checkpoint inhibitors in cancer treatment, and the oncologists ask us to manage this because it interferes with the treatment of the underlying malignancy. Those are often readily treated because we catch them so early.” Metabolic syndrome and associated conditions increase the risk for SM, as does aging.

The recent changes are intended to help clinicians predict disease activity and the need for other testing or treatment. “For instance, most cases are indolent and do not require aggressive treatment — often no treatment at all — but for those that are aggressive, we want the clinician to be able to identify those and make sure the treatment is appropriate. The aggressive cases may warrant tertiary referral,” Worthington said. “A secondary cancer is a possibility in this condition, so drawing from the SM radiology studies, we try to help the clinician decide who needs other testing, such as PET-CT or biopsy, and who can be monitored.”

As many as 60% of cases are asymptomatic, requiring no treatment. Abdominal pain is the most frequent symptom and its location on clinical examination should correspond to the SM lesion on imaging. Treatment involves anti-inflammatory medications tailored to disease severity and clinical response.

Biopsy is not necessary if the lesion meets three of the five CT criteria reported by B. Coulier and has no features of more aggressive disease or malignancy. Although some have suggested that SM may be a paraneoplastic syndrome, current evidence does not support this. SM must be differentiated from other diagnoses such as non-Hodgkin’s lymphoma, peritoneal carcinomatosis, and mesenteric fibromatosis.

“There are now CT guidelines for who actually has SM, who needs a biopsy or a PET-CT to rule-out malignancy, and who doesn’t,” said Worthington. “Radiologists do not always use the Coulier criteria for diagnosis, but often they will with encouragement. From this review, a GI clinician should be able to identify SM on CT.”

Epidemiologically, retrospective CT studies have reported a frequency of 0.6%-1.1%, the panelists noted. And while demographic data are limited, a large early case series reported that patients with SM had a mean age of 55 years and were more likely to be men and White.

Patients with SM do not have a higher prevalence of autoimmunity in general, but may have increased rates of metabolic syndrome, obesity, coronary artery disease, and urolithiasis, the panelists noted.

The update allows room for differences in clinical judgment. “For instance, a longer or more frequent CT surveillance interval can be justified depending on the patient’s findings, and no one should feel locked in by these recommendations,” Worthington said.

 

Medical Therapy

Although there is no surgical cure, pharmacologic options are many. These include prednisone, tamoxifen, colchicine, azathioprine, thalidomide, cyclophosphamide, and methotrexate, as well as the biologics rituximab, infliximab and ustekinumab. Current corticosteroid-based therapies often require months to achieve a clinical response, however.

Bowel obstruction is managed nonoperatively when feasible, but medically refractory disease may require surgical bypass.

Gastroenterologist Stephen B. Hanauer, MD, AGAF, a professor of medicine at Northwestern Medicine in Chicago, who was not involved in formulating the guidance, offered his perspective: “The most useful component of the practical review is the algorithm for diagnosis and determination when biopsy or follow-up imaging is reasonable in the absence of evidence.” He stressed that the recommendations are pragmatic rather than evidence-based “as there are no controlled trials and the presentation is heterogeneous.”


Hanauer added that none of the recommended treatments has been shown to reduce the lesions on imaging. “Hence, all of the treatments are empiric without biological or imaging endpoints.”

In his experience, patients with inflammatory features are the best candidates for immune-directed therapies as reduction in inflammatory markers is a potential endpoint, although no therapies have demonstrated an effect on imaging or progression. “As an IBD doctor, I favor steroids and azathioprine or anti-TNF directed therapy, but again, there is no evidence beyond reports of symptomatic improvement.” 

Worthington and colleagues agreed that treatment protocols have developed empirically. “Future investigation for symptomatic SM should focus on the nature of the inflammatory response, including causative cytokines and other proinflammatory mediators, the goal being targeted therapy with fewer side effects and a more rapid clinical response,” they wrote.

Currently, said Worthington, the biggest gaps remain in treatment. “Even the best studies are small and anecdotal, and we do not know the cytokine or other proinflammatory mediators.”

This guidance was supported by the AGA. Worthington reported remuneration from TriCity Surgery Center, Prescott, Ariz. Hanauer had no conflicts of interest relevant to his comments.

A version of this article appeared on Medscape.com.


Article Source

FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY


Digital Mindfulness Program May Reduce Anxiety in Patients With Chronic Obstructive Pulmonary Disease


TOPLINE:

An 8-week smartphone-based mindfulness program using audio-guided meditation reduced anxiety and improved emotional well-being in patients with chronic obstructive pulmonary disease (COPD), and it also provided immediate relief from stress, anxiety, and dyspnea after each session.

METHODOLOGY:

  • A considerable proportion of patients with COPD experience clinically significant anxiety and depressive symptoms; psychological interventions that are easy to implement as add-on treatments can alleviate these symptoms.
  • In this pilot study, 30 patients (mean age, 62.68 y; 60.5% women) with COPD and subclinical symptoms of anxiety or depression were enrolled and allocated to an 8-week self-administered digital mindfulness-based intervention (n = 14) or the waitlist control (n = 16).
  • Patients in the intervention group had an introductory face-to-face session, followed by daily smartphone audio-guided meditation adapted for patients with COPD. The waitlist group received the same intervention after the study period ended.
  • The primary endpoints were the feasibility of the intervention and its effects on anxiety and depression symptoms at baseline, 4 weeks, and 8 weeks.

TAKEAWAY:

  • Patients in the intervention group practiced mindfulness on 81.38% of the 56 intervention days.
  • After 8 weeks, the intervention group showed a significant reduction in anxiety (P = .010) compared with the waitlist group; however, no significant improvement was observed for depression.
  • Similarly, significant improvements were reported for emotional functioning (P = .004), but no significant reductions in perceived stress and hair cortisol levels were observed after 8 weeks.
  • Significant reductions were reported for momentary subjective stress (P < .001), anxiety (P = .022), and dyspnea (P < .001) immediately after meditation sessions.

IN PRACTICE:

“The investigated self-administered digital MBI [mindfulness-based intervention], including brief 10- to 15-minute meditations, was feasible and holds potential as low-threshold add-on treatment to alleviate anxiety after 8 weeks and reduce momentary subjective stress, anxiety, and dyspnea in everyday life,” the study authors wrote.

SOURCE:

This study was led by Hannah Tschenett, Department of Clinical and Health Psychology, Faculty of Psychology, University of Vienna, Vienna, Austria, and was published online in Respiratory Research.

LIMITATIONS:

This study had several limitations, including a small sample size, lack of a true control group, and potential selection bias due to recruitment from centers with patients already interested in mindfulness, which may have inflated adherence. Additionally, generalizability to all patients with COPD was limited, as many were either ineligible or declined to participate.

DISCLOSURES:

This study was funded by the Scientific Medical Fund of the City of Vienna and the Karl Landsteiner Institute (KLI) for Lung Research and Pulmonary Oncology. The KLI received funding from AstraZeneca, Boehringer Ingelheim, Chiesi, Linde plc, Menarini Pharma, Novartis, and Vivisol Austria. Three authors reported being employees of KLI or receiving lecture fees from some of these pharmaceutical companies.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication.

A version of this article first appeared on Medscape.com.



Strength Training Can Improve Lymphedema in Breast Cancer


TOPLINE:

A recent study found that 3 months of resistance training did not worsen lymphedema in breast cancer survivors and instead significantly improved fluid balance and increased upper extremity muscle mass. The edema index also improved, suggesting potential therapeutic benefits of intense resistance training for managing lymphedema.

METHODOLOGY:

  • Lymphedema is a common adverse effect of breast cancer treatment that can limit mobility. Although strength training can have multiple benefits for breast cancer survivors, such as increased bone density and metabolism, data on whether more intense resistance training exacerbates lymphedema in this population are limited. Concerns that more intense training could trigger or worsen lymphedema have typically led to cautious recommendations.
  • Researchers conducted a cohort study involving 115 women with breast cancer (median age, 54 years; 96% White; 4% Black) between September 2022 and March 2024. Most (83%) underwent sentinel lymph node biopsy (SLNB), while 12% had axillary lymph node dissection (ALND). At baseline, 13% had clinical lymphedema, including 37% in the ALND group and 8% in the SLNB group.
  • Participants attended resistance training sessions three times a week, with intensity escalation over 3 months. Exercises involved hand weights, resistance bands, and body weight (eg, pushups) to promote strength, mobility, and muscle hypertrophy.
  • Bioimpedance analysis measured intracellular water, extracellular water, and total body water before and after exercise. Lymphedema was defined as more than a 3% increase in arm circumference discrepancy relative to preoperative ipsilateral arm measurements, along with an elevated edema index (extracellular water to total body water ratio).
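The study's operational definition of lymphedema can be expressed as a short calculation. The sketch below is illustrative only: the >3% circumference threshold and the edema index ratio come from the summary above, but the variable names, example measurements, and the way "elevated" is operationalized (comparison against a reference index) are assumptions, not the study protocol.

```python
# Sketch of the lymphedema definition described above, as a calculation.
# Variable names and example numbers are illustrative, not from the study.

def edema_index(ecw_liters: float, tbw_liters: float) -> float:
    """Edema index = extracellular water / total body water (bioimpedance)."""
    return ecw_liters / tbw_liters

def meets_lymphedema_definition(
    preop_circumference_cm: float,
    current_circumference_cm: float,
    ecw_liters: float,
    tbw_liters: float,
    reference_edema_index: float,
) -> bool:
    """Flags lymphedema per the summary: a >3% increase in arm circumference
    relative to the preoperative ipsilateral arm measurement, together with
    an elevated edema index (extracellular water to total body water ratio)."""
    pct_increase = (
        (current_circumference_cm - preop_circumference_cm)
        / preop_circumference_cm * 100
    )
    return pct_increase > 3.0 and edema_index(ecw_liters, tbw_liters) > reference_edema_index

# A 4% circumference increase with a rising edema index meets the definition.
print(meets_lymphedema_definition(30.0, 31.2, 17.0, 42.0, 0.383))  # → True
```

This mirrors why the reported drop in mean edema index (0.385 to 0.383) reads as reduced lymphedema: a lower ECW/TBW ratio indicates less extracellular fluid accumulation.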

TAKEAWAY:

  • No participants experienced subjective or clinical worsening of lymphedema after completing the resistance training regimen.
  • Lean mass in the affected arm increased from a median of 5.45 lb to 5.64 lb (P < .001), while lean mass in the unaffected arm rose from 5.51 lb to 5.53 lb (P < .001) after the resistance training.
  • Overall, participants’ fluid balance improved. The edema index in both arms showed a significant reduction at training completion (mean, 0.383) vs baseline (mean, 0.385), indicating reduced lymphedema. Subgroup analysis of women who underwent SLNB showed similar improvements in the edema index.

IN PRACTICE:

“These findings highlight the safety of strength and resistance training in a large group of patients with breast cancer during and after treatment,” the authors wrote. Beyond that, the authors noted, the results point to a potential role for resistance training in reducing lymphedema.

SOURCE:

This study, led by Parisa Shamsesfandabadi, MD, Allegheny Health Network, Pittsburgh, was published online in JAMA Network Open.

LIMITATIONS:

A major limitation was the absence of a control group, which prevented a direct comparison between the effects of exercise and the natural progression of lymphedema. The 3-month intervention provided limited insight into the long-term sustainability of benefits. Patient-reported outcomes were not included. Additionally, potential confounding variables such as diet, medication use, and baseline physical activity levels were not controlled for in the analysis.

DISCLOSURES:

The authors did not disclose any funding information. Several authors reported having ties with various sources. Additional disclosures are noted in the original article.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.

  • Overall, participants’ fluid balance improved. The edema index in both arms showed a significant reduction at training completion (mean, 0.383) vs baseline (mean, 0.385), indicating reduced lymphedema. Subgroup analysis of women who underwent SLNB showed similar improvements in the edema index.
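For readers who want the arithmetic behind the edema index, a minimal sketch follows. It assumes only the definition given above (extracellular water divided by total body water from bioimpedance analysis); the water volumes are invented for illustration and are not study data.

```python
# Illustrative sketch of the edema index used in the study: the ratio of
# extracellular water (ECW) to total body water (TBW) from bioimpedance.
# All volumes below are hypothetical, chosen to mirror the reported means.

def edema_index(ecw_liters: float, tbw_liters: float) -> float:
    """Return ECW/TBW; a higher ratio indicates more extracellular fluid (edema)."""
    if tbw_liters <= 0:
        raise ValueError("total body water must be positive")
    return ecw_liters / tbw_liters

# Hypothetical pre/post measurements consistent with the reported group means
baseline = edema_index(ecw_liters=14.25, tbw_liters=37.0)    # ~0.385
completion = edema_index(ecw_liters=14.17, tbw_liters=37.0)  # ~0.383
print(round(baseline, 3), round(completion, 3))  # → 0.385 0.383
```

A small absolute change in this ratio can still be meaningful because TBW is large relative to the fluid shifts involved in early lymphedema.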

IN PRACTICE:

“These findings highlight the safety of strength and resistance training in a large group of patients with breast cancer during and after treatment,” the authors wrote. Beyond that, the authors noted, the results point to a potential role for resistance training in reducing lymphedema.

SOURCE:

This study, led by Parisa Shamsesfandabadi, MD, Allegheny Health Network, Pittsburgh, was published online in JAMA Network Open.

LIMITATIONS:

A major limitation was the absence of a control group, which prevented a direct comparison between the effects of exercise and the natural progression of lymphedema. The 3-month intervention provided limited insight into the long-term sustainability of benefits. Patient-reported outcomes were not included. Additionally, potential confounding variables such as diet, medication use, and baseline physical activity levels were not controlled for in the analysis.

DISCLOSURES:

The authors did not disclose any funding information. Several authors reported having ties with various sources. Additional disclosures are noted in the original article.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.


Ethnic Disparities in Cancer Reflect Disparities in HIV Care


While several cancers associated with immunosuppression are much more common in White men who have sex with men living with HIV (MSMWH) than in the male general population, they are even more frequently seen in Black and Hispanic MSMWH. 

This suggests that racial and ethnic disparities in access to antiretroviral therapy and viral suppression are playing a role, said the authors of an analysis published last month in AIDS.

“Disparities in cancer risk may serve as an important proxy for disparities in HIV care,” they wrote.

Researchers at the National Cancer Institute leveraged data from the HIV/AIDS Cancer Match Study, which covers 13 US states and the District of Columbia. For this analysis, they examined cancer incidence in over 350,000 MSMWH followed for 3.2 million person-years between 2001 and 2019.

They focused on Kaposi sarcoma, non-Hodgkin lymphoma, Hodgkin lymphoma, anal cancer, and liver cancer — all malignancies that are associated with viral infections and immunosuppression. They restricted their analysis to MSM because behavioral factors (such as anal sex) contribute to increased exposure to viral infections in this population.

The study’s intersectional lens is valuable, Gita Suneja, MD, said in an interview. “It is looking at racial and ethnic disparities within an already minoritized group, which is men who have sex with men living with HIV,” said the professor of radiation oncology at the University of Utah, Salt Lake City, Utah, who was not involved in the study.

“It’s really profound to me to sit back and think about how these disparities intersect, and how somebody can be so marginalized: it’s not just race or ethnicity, it’s not just having a stigmatized medical condition, it’s the confluence of all of these factors that leads to exclusion from care and poor outcomes.”

Standardized incidence ratios (SIRs), using men of the same ethnicity and age in the general population as the comparator, were reported for MSMWH of different racial/ethnic groups. For non-Hodgkin lymphoma, the SIR was 3.11 for White MSMWH, rising to 4.84 for Black MSMWH and 5.46 for Hispanic MSMWH. 

For Hodgkin lymphoma, the SIRs were 6.35, 7.69, and 11.5, respectively. For Kaposi sarcoma, they were far higher, at 417 for White MSMWH, 772 for Black MSMWH, and 887 for Hispanic MSMWH.
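To make the SIR figures concrete, a brief sketch of how a standardized incidence ratio is computed: observed cases in the cohort divided by the cases expected if general-population, stratum-specific rates applied to the cohort's person-years. All counts and rates below are hypothetical, not values from this study.

```python
# Hedged sketch of a standardized incidence ratio (SIR).
# Expected cases = sum over strata of (person-years x population rate).
# SIR = observed / expected; SIR > 1 means more cases than expected.
# Every number here is invented for illustration.

def expected_cases(person_years_by_stratum, rate_per_100k_by_stratum):
    """Sum expected cases across age strata."""
    return sum(py * rate / 100_000
               for py, rate in zip(person_years_by_stratum, rate_per_100k_by_stratum))

def sir(observed: int, expected: float) -> float:
    return observed / expected

# Hypothetical strata: cohort person-years and general-population rates
py = [50_000, 80_000, 40_000]
rates = [5.0, 12.0, 20.0]        # cases per 100,000 person-years
exp = expected_cases(py, rates)  # 2.5 + 9.6 + 8.0 = 20.1 expected cases
print(round(sir(observed=100, expected=exp), 2))  # → 4.98
```

Because the comparator is men of the same ethnicity and age in the general population, differences in SIR across racial/ethnic groups reflect excess risk beyond each group's own background rate.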

In contrast, for anal cancer and liver cancer, the highest SIRs were among White MSMWH.

Given the role of immunosuppression, the researchers wanted to see whether cancer incidence differed according to prior AIDS diagnosis. However, they found that within each racial/ethnic group, there were no statistically significant differences in SIR according to AIDS status.

“There were disparities across the board for [racially minoritized] groups, regardless of immunosuppression status, which leads us to believe that it isn’t just about the diagnosis of AIDS, but about many other factors that we’re not capturing in the paper,” first author Benton Meldrum, MPH, told this news organization.

One study limitation is that AIDS diagnosis is an imprecise proxy for immunosuppression. It does not capture the duration and severity of immunosuppression, nor the extent of immune restoration. Many people with a previous AIDS diagnosis are now virally suppressed.

Database studies have inherent limitations in terms of the range of parameters recorded. In an ideal world, Meldrum said, they would have had access to information on CD4 count and viral suppression over time, as well as socioeconomic factors such as income and insurance status.

Differences in timely HIV diagnosis, viral suppression, and continued engagement in care are thought to drive the differences in cancer incidence. “HIV control today helps mitigate the risk of cancer development down the road,” Suneja said.

While not addressed by this study, there may be additional differences in cancer survival. Differences in cancer care, including prompt diagnosis and access to effective treatment, could play a role.

In terms of practical interventions to address these disparities, Suneja highlighted the value of programs that help patients navigate a complex healthcare system, such as care coordination, peer navigation, and delivering services in community settings.

Such interventions don’t only benefit marginalized groups but help improve healthcare access and outcomes for everyone, she said. Even people with insurance and high health literacy often struggle to remain engaged.

“When we design healthcare systems to best serve those that have been left furthest behind, we all do better,” Suneja said.

The study was funded by the Intramural Research Program of the National Cancer Institute. Suneja and Meldrum reported having no relevant financial relationships.

A version of this article first appeared on Medscape.com.


OIG Report Reveals Lapses in VA Retention and Recruitment Process


The Veterans Health Administration (VHA) paid about $828 million in recruitment and retention incentives from 2020 to 2023, but the process for providing an estimated $340.9 million of that was not “effectively governed” according to a recent US Department of Veterans Affairs (VA) Office of Inspector General (OIG) investigation.

About one-third of the incentive payments, which went to roughly 130,000 VHA employees in total, were missing forms or signatures or lacked sufficient justification. In the report, the OIG notes the VHA has faced “long-standing challenges related to occupational shortages,” adding that a shortage occupation designation does not mean there are actual shortages at a given facility.

“Most shortage occupations continue to experience annual net growth and are not critically understaffed in most facilities,” the report says.

More than 85% of incentive monies in 2022 and 2023 were paid to employees in occupations on staffing shortage lists. OIG estimated the VHA paid incentives to 38,800 employees (about 30%) where the award justification could not be verified or was insufficient.

Amplified by the COVID-19 pandemic and the PACT Act, the need to recruit and retain employees peaked in 2021, when record numbers of health care workers left their jobs. An October 2021 survey of 1000 medical professionals found nearly 1 in 5 health care workers quit during the pandemic, with most citing stress and burnout, and an additional 31% were considering quitting. When the PACT Act was signed into law in August 2022, it created thousands of newly benefits-eligible veterans.

In May 2022, the VA reported it needed to hire 52,000 employees annually for the next 5 years to keep up. In response, the VA released a 10-step plan to support recruitment and retention, focusing on raising wages where possible and offering other incentives (eg, relocation bonuses or greater flexibility for remote work) where it was not. The OIG report acknowledged the pandemic exacerbated VHA’s recruitment and retention challenges.

By 2024, the VA had not only reduced employee turnover by 20% over the prior 2 years, but had also exceeded its hiring goals. The VHA workforce grew by 7.4% in fiscal year 2023, its highest rate of growth in > 15 years.

VA officials must retain the documentation for incentives for 6 years so the process can be reconstructed if necessary. However, the OIG report noted “numerous instances” where documentation couldn’t be produced and therefore could not determine whether the incentives complied with policy. 

The report also identified 28 employees who received retention incentive payments long after their award period had expired. The VA paid about $4.6 million for incentives that should have been terminated. The VA reported that it is pursuing debt collection for 27 of the 28 employees. 

Only if the “identified weaknesses” are addressed will the VHA have assurance that incentives are being used effectively, the OIG said. Its recommendations included enforcing quality control checks and establishing accountability measures. The OIG also recommended establishing oversight procedures to review retention incentives annually, recertify them if appropriate, or terminate them.


Celiac Blood Test Eliminates Need for Eating Gluten


Think your patient may have celiac disease? The harsh reality is that current diagnostic tests require patients to consume gluten for an accurate diagnosis, which poses challenges for individuals already avoiding gluten.

A more tolerable approach appears to be on the horizon. Researchers in Australia have developed a blood test that can identify celiac disease with high sensitivity and specificity, even without consuming gluten.

“This is a simple and accurate test that can provide a diagnosis within a very short time frame, without the need for patients to continue eating gluten and feeling sick, or to wait months for a gastroscopy,” Olivia Moscatelli, PhD candidate, Tye-Din Lab, Walter and Eliza Hall Institute and University of Melbourne, Parkville, Australia, told GI & Hepatology News.

The study was published in Gastroenterology.


Most Cases Go Undiagnosed

Celiac disease is an autoimmune disorder triggered by gluten found in wheat, rye, and barley. The only available treatment is a strict, life-long gluten-free diet.

The global prevalence of celiac disease is estimated at around 1%-2%, with 50%-80% of cases either undiagnosed or diagnosed late. That’s because the current reliable diagnosis of celiac disease requires the intake of gluten, which may deter people from seeking a diagnosis.

In earlier work, the researchers, working with Robert Anderson, MBChB, BMedSc, PhD, AGAF, now with Novoviah Pharmaceuticals, made the unexpected discovery that interleukin-2 (IL-2) spiked in the blood of people with celiac disease shortly after they ate gluten.

But would this signal be present when no gluten had been consumed?

The team developed and tested a simple whole blood assay measuring IL-2 release (WBAIL-2) for detecting gluten-specific T cells to aid in diagnosing celiac disease.

They collected blood samples from 181 volunteers: 75 with treated celiac disease on a gluten-free diet, 13 with active untreated celiac disease, 32 with nonceliac gluten sensitivity, and 61 healthy controls. The blood samples were mixed with gluten in a test tube for a day to see whether the IL-2 signal appeared.

The WBAIL-2 assay demonstrated high accuracy for celiac disease, even in patients following a strict gluten-free diet.

For patients with HLA-DQ2.5+ genetics, sensitivity was 90% and specificity was 95%, with lower sensitivity (56%) for patients with HLA-DQ8+ celiac disease.

The WBAIL-2 assay correlated strongly with the frequency of tetramer-positive gluten-specific CD4+ T cells used to diagnose celiac disease and monitor treatment effectiveness, and with serum IL-2 levels after gluten challenge.

The strength of the IL-2 signal correlated with the severity of a patient’s symptoms, “allowing us to predict how severely a person with celiac disease might react to gluten, without them actually having to eat it,” Moscatelli said in a news release.

“Current diagnostic practice involves a blood-based serology test followed by a confirmatory gastroscopy if positive. Both tests require the patient to eat gluten daily for 6-12 weeks prior for accurate results. We envision the new blood test (IL-2 whole blood assay) will replace the invasive gastroscopy as the confirmatory test following positive serology,” Moscatelli told GI & Hepatology News.

“In people already following a gluten-free diet, we propose they would have this new blood test done on two separate occasions and two positive results would be required for a celiac diagnosis. This would allow a large number of people who previously have been unable to go through the current diagnostic process to receive a diagnosis,” Moscatelli said.

 

Practice-Changing Potential

A blood-based test that can accurately detect celiac disease without the need for a gluten challenge would be “welcome and practice changing,” said Christopher Cao, MD, director, Celiac Disease Program, Division of Gastroenterology, Mount Sinai Health System, New York City.

“A typical ‘gluten challenge’ involves eating the equivalent of 1-2 slices of bread daily for the course of 6 weeks, and this may be incredibly difficult for patients who have already been on a gluten-free diet prior to an official celiac disease diagnosis. Inability to perform a gluten challenge limits the ability to make an accurate celiac disease diagnosis,” Cao told GI & Hepatology News.

“This study shows that gluten-stimulated interleukin-2 release assays may correlate with the presence of pathogenic gluten-specific CD4+ T cell response in celiac disease,” Cao noted.

He cautioned that “further large cohort, multicenter prospective studies are needed to assess generalizability and may be helpful in evaluating the accuracy of WBAIL-2 in non-HLA DQ2.5 genotypes.” 

Other considerations prior to implementation may include reproducibility across different laboratories and overall cost effectiveness, Cao said. “Ultimately in clinic, the role of WBAIL-2 will need to be better defined within the algorithm of celiac disease testing,” he added.

 

The Path Ahead

The researchers plan to test the performance of the IL-2 whole blood assay in a pediatric cohort, as well as in other countries to demonstrate the reproducibility of the test. In these studies, the test will likely be performed alongside the current diagnostic tests (serology and gastroscopy), Moscatelli told GI & Hepatology News.

“There are some validation studies starting in other countries already as many celiac clinicians globally are interested in bringing this test to their clinical practice. I believe the plan is to have this as an approved diagnostic test for celiac disease worldwide,” she said.

Novoviah Pharmaceuticals is managing the commercialization of the test, and the plan is to get it into clinical practice in the next 2 years, Moscatelli said.

The research was supported by Coeliac Australia, Novoviah Pharmaceuticals (which provided the proprietary test for this study), the Beck Family Foundation, the Butterfield Family, and the Veith Foundation. A complete list of author disclosures is available with the original article. Cao had no relevant disclosures.

A version of this article appeared on Medscape.com.




FROM GASTROENTEROLOGY


Can Modulation of the Microbiome Improve Cancer Immunotherapy Tolerance and Efficacy?


WASHINGTON — For years, oncologist Jonathan Peled, MD, PhD, and his colleagues at Memorial Sloan Kettering Cancer Center (MSKCC) in New York City have been documenting gut microbiota disruption during allogeneic hematopoietic stem cell transplantation (allo-HSCT) and its role in frequent and potentially fatal bloodstream infections (BSIs) in the first 100 days after transplant.

Modulating microbiome composition to improve outcomes after allo-HSCT for hematological malignancies is a prime goal, and at the Gut Microbiota for Health (GMFH) World Summit 2025, Peled shared two new findings.

In one study, his team found that sucrose can exacerbate antibiotic-induced microbiome injury in patients undergoing allo-HSCT — a finding that “raises the question of whether our dietary recommendations [for] allo-HSCT patients are correct,” said Peled, assistant attending at MSKCC, during a session on the gut microbiome and oncology.

And in another study, they found that a rationally designed probiotic formulation may help lower the incidence of bacterial BSIs. In December 2024, the probiotic formulation (SER-155, Seres Therapeutics, Inc.) was granted breakthrough therapy designation by the FDA.

With immunotherapies more broadly, researchers are increasingly looking at diet and modulation of the microbiome to improve both treatment tolerance and efficacy, experts said at the meeting convened by the AGA and the European Society of Neurogastroenterology and Motility.

“Cancer patients and caregivers are asking, ‘What should I eat?’” said Carrie Daniel-MacDougall, PhD, MPH, a nutritional epidemiologist at the University of Texas MD Anderson Cancer Center in Houston. “They’re not just focused on side effects — they want a good outcome for their treatment, and they’re exploring a lot of dietary strategies [for which there] is not a lot of evidence.”

Clinicians are challenged by the fact that “we don’t typically collect dietary data in clinical trials of cancer drugs,” leaving them to extrapolate from evidence-based diet guidelines for cancer prevention, Daniel-MacDougall said.

But “I think that’s starting to shift,” she said, with the microbiome being increasingly recognized for its potential influences on therapeutic response and clinical trials underway looking at “a healthy dietary pattern not just for prevention but survival.”

 

Diet and Probiotics After allo-HSCT

The patterns of microbiota disruption during allo-HSCT — a procedure that includes antibiotic administration, chemotherapy, and sometimes irradiation — are characterized by loss of diversity and the expansion of potentially pathogenic organisms, most commonly Enterococcus, said Peled.

This has been demonstrated across transplantation centers: In a multicenter, international study published in 2020, the patterns of microbiota disruption and their impact on mortality were similar at MSKCC and other transplantation centers, with higher diversity of intestinal microbiota associated with lower mortality.

Other studies have shown that Enterococcus domination alone (defined arbitrarily as > 30% of fecal microbial composition) is associated with graft-vs-host disease and higher mortality after allo-HSCT and that intestinal domination by Proteobacteria coincides temporally with BSIs, he said.

Autologous fecal microbiota transplantation (FMT) has been shown to largely restore the microbiota composition the patient had before antibiotic treatment and allo-HSCT, he said, making fecal sample banking and posttreatment FMT a potential approach for reconstituting the gut microbiome and improving outcomes.

But “lately we’ve been very interested in diet for modulating [harmful] patterns” in the microbiome composition, Peled said.

In the new study suggesting a role for sugar avoidance, published last year as a bioRxiv preprint, Peled and his colleagues collected real-time dietary intake data (40,702 food entries) from 173 patients hospitalized for several weeks for allo-HSCT at MSK and analyzed it alongside longitudinally collected fecal samples. They used a Bayesian mixed-effects model to identify dietary components that may correlate with microbial disruption.

“What jumped out as very predictive of a low diversity fecal sample [and expansion of Enterococcus] in the 2 days prior to collection was the interaction between antibiotics and the consumption of sweets” — foods rich in simple sugars, Peled said. The relationship between sugar and the microbiome occurred only during periods of antibiotic exposure.

“And it was particularly perplexing because the foods that fall into the ‘sweets’ category are foods we encourage people to eat clinically when they’re not feeling well and food intake drops dramatically,” he said. This includes foods like nutritional drinks or shakes, Italian ice, gelatin dessert, and sports drinks.

(In a mouse model of post-antibiotic Enterococcus expansion, Peled and his co-investigators then validated the findings and ruled out the impact of any reductions in fiber.)

In addition to possibly revising dietary recommendations for patients undergoing allo-HSCT, the findings raise the question of whether avoiding sugar intake while on antibiotics, in general, is a way to mitigate antibiotic-induced dysbiosis, he said.

To test the role of probiotics, Peled and colleagues collaborated with Seres Therapeutics on a phase 1b trial of an oral combination (SER-155) of 16 fermented strains “selected rationally,” he said, for their ability to decolonize gut pathogens, improve gut barrier function (in vitro), and reduce gut inflammation and local immune activation.

After a safety lead-in, patients were randomized to receive SER-155 (n = 20) or placebo (n = 14) three times — prior to transplant, upon neutrophil engraftment (with vancomycin “conditioning”), and after transplant. “The strains succeeded in grafting in the GI tract…and some of them persisted all the way through to day 100,” Peled said.

The incidence of pathogen domination was substantially lower in the probiotic recipients compared to an MSK historical control cohort, and the incidence of BSIs was significantly lower compared to the placebo arm (10% vs 43%, respectively, representing a 77% relative risk reduction), he said.

 

Diet and Immunotherapy Response: Trials at MD Anderson

One of the first trials Daniel-MacDougall launched at MD Anderson on diet and the microbiome randomized 55 patients who were obese and had a history of colorectal cancer or precancerous polyps to add a cup of beans to their usual diet or to continue their usual diet without beans. There was a crossover at 8 weeks in the 16-week BE GONE trial; stool and fasting blood were collected every 4 weeks.

“Beans are a prebiotic powerhouse in my opinion, and they’re also something this population would avoid,” said Daniel-MacDougall, associate professor in the department of epidemiology at MD Anderson and faculty director of the Bionutrition Research Core and Research Kitchen.

“We saw a modest increase in alpha diversity [in the intervention group] and similar trends with microbiota-derived metabolites” that regressed when patients returned to their usual diet, she said. The researchers also documented decreases in proteomic biomarkers of intestinal and systemic immune and inflammatory response.

The impact of diet on cancer survival was shown in subsequent research, including an observational study published in Science in 2021 of patients with melanoma receiving immune checkpoint blockade (ICB) treatment. “Patients who consumed insufficient dietary fiber at the start of therapy tended to do worse [than those reporting sufficient fiber intake],” with significantly lower progression-free survival, Daniel-MacDougall said.

“And interestingly, when we looked at dietary fiber [with and without] probiotic use, patients who had sufficient fiber but did not take probiotics did the best,” she said. (The probiotics were not endorsed or selected by their physicians.)

Now, the researchers at MD Anderson are moving into “precision nutrition” research, Daniel-MacDougall said, with a phase 2 randomized, double-blind trial of high dietary fiber intake (a target of 50 g/d from whole foods) vs a healthy control diet (20 g/d of fiber) in patients with melanoma receiving ICB.

The study, which is underway, is a fully controlled feeding study, with all meals and snacks provided by MD Anderson and macronutrients controlled. Researchers are collecting blood, stool, and tumor tissue (if available) to answer questions about the microbiome, changes in systemic and tissue immunity, disease response and immunotherapy toxicity, and other issues.

Peled disclosed IP licensing and research support from Seres Therapeutics; consulting with Da Volterra, MaaT Pharma, and CSL Behring; and advisory/equity with Postbiotics + Research LLC and Prodigy Biosciences. Daniel-MacDougall reported having no disclosures.

A version of this article appeared on Medscape.com.



The study, which is underway, is a fully controlled feeding study, with all meals and snacks provided by MD Anderson and macronutrients controlled. Researchers are collecting blood, stool, and tumor tissue (if available) to answer questions about the microbiome, changes in systemic and tissue immunity, disease response and immunotherapy toxicity, and other issues.

Peled disclosed IP licensing and research support from Seres Therapeutics; consulting with Da Volterra, MaaT Pharma, and CSL Behring; and advisory/equity with Postbiotics + Research LLC and Prodigy Biosciences. Daniel-MacDougall reported having no disclosures.

A version of this article appeared on Medscape.com.

WASHINGTON — For years, oncologist Jonathan Peled, MD, PhD, and his colleagues at Memorial Sloan Kettering Cancer Center (MSKCC) in New York City have been documenting gut microbiota disruption during allogeneic hematopoietic stem cell transplantation (allo-HSCT) and its role in frequent and potentially fatal bloodstream infections (BSIs) in the first 100 days after transplant.

Modulating microbiome composition to improve outcomes after allo-HSCT for hematological malignancies is a prime goal, and at the Gut Microbiota for Health (GMFH) World Summit 2025, Peled shared two new findings.

In one study, his team found that sucrose can exacerbate antibiotic-induced microbiome injury in patients undergoing allo-HSCT — a finding that “raises the question of whether our dietary recommendations [for] allo-HSCT patients are correct,” said Peled, assistant attending at MSKCC, during a session on the gut microbiome and oncology.

And in another study, they found that a rationally designed probiotic formulation may help lower the incidence of bacterial BSIs. In December 2024, the probiotic formulation (SER-155, Seres Therapeutics, Inc.) was granted breakthrough therapy designation by the FDA.

With immunotherapies more broadly, researchers are increasingly looking at diet and modulation of the microbiome to improve both treatment tolerance and efficacy, experts said at the meeting convened by the AGA and the European Society of Neurogastroenterology and Motility.

“Cancer patients and caregivers are asking, ‘What should I eat?’” said Carrie Daniel-MacDougall, PhD, MPH, a nutritional epidemiologist at the University of Texas MD Anderson Cancer Center in Houston. “They’re not just focused on side effects — they want a good outcome for their treatment, and they’re exploring a lot of dietary strategies [for which there] is not a lot of evidence.”

Clinicians are challenged by the fact that “we don’t typically collect dietary data in clinical trials of cancer drugs,” leaving them to extrapolate from evidence-based diet guidelines for cancer prevention, Daniel-MacDougall said.

But “I think that’s starting to shift,” she said, with the microbiome increasingly recognized for its potential influence on therapeutic response and clinical trials now underway examining “a healthy dietary pattern not just for prevention but survival.”

 

Diet and Probiotics After allo-HSCT

The patterns of microbiota disruption during allo-HSCT — a procedure that includes antibiotic administration, chemotherapy, and sometimes irradiation — are characterized by loss of diversity and the expansion of potentially pathogenic organisms, most commonly Enterococcus, said Peled.

This has been demonstrated across transplantation centers. In a multicenter, international study published in 2020, the patterns of microbiota disruption and their impact on mortality were similar across MSK and other transplantation centers, with higher diversity of intestinal microbiota associated with lower mortality.
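
The talk does not specify which alpha-diversity metric these studies used; the inverse Simpson index is one common choice in the microbiome literature. A minimal sketch, with illustrative numbers rather than trial data:

```python
def inverse_simpson(abundances):
    """Inverse Simpson alpha-diversity: 1 / sum(p_i^2), where p_i are
    relative abundances. Higher values = a more even, diverse community."""
    total = sum(abundances)
    props = [a / total for a in abundances]
    return 1.0 / sum(p * p for p in props)

# Hypothetical fecal samples (relative taxon counts; illustrative only)
diverse = [25, 20, 20, 15, 10, 10]   # evenly spread community
disrupted = [90, 4, 3, 2, 1]         # one taxon dominates after antibiotics

print(round(inverse_simpson(diverse), 2))    # 5.41
print(round(inverse_simpson(disrupted), 2))  # 1.23
```

The disrupted sample scores far lower, matching the pattern described: domination by a single organism collapses diversity.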

Other studies have shown that Enterococcus domination alone (arbitrarily defined as > 30% of fecal microbial composition) is associated with graft-versus-host disease and higher mortality after allo-HSCT and that intestinal domination by Proteobacteria coincides temporally with BSIs, he said.
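
The 30% domination cutoff described above is straightforward to apply to a sample's relative abundances; a small sketch with hypothetical values:

```python
def dominant_taxa(composition, threshold=0.30):
    """Return taxa whose relative abundance exceeds the domination
    threshold (the studies cited used an arbitrary 30% cutoff)."""
    return [taxon for taxon, frac in composition.items() if frac > threshold]

# Hypothetical fecal composition (fractions of total; illustrative only)
sample = {"Enterococcus": 0.46, "Bacteroides": 0.22,
          "Blautia": 0.12, "other": 0.20}

print(dominant_taxa(sample))  # ['Enterococcus']
```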

Autologous fecal microbiota transplantation (FMT) has been shown to largely restore the microbiota composition the patient had before antibiotic treatment and allo-HSCT, he said, making fecal sample banking and posttreatment FMT a potential approach for reconstituting the gut microbiome and improving outcomes.

But “lately we’ve been very interested in diet for modulating [harmful] patterns” in the microbiome composition, Peled said.

In the new study suggesting a role for sugar avoidance, published last year as a bioRxiv preprint, Peled and his colleagues collected real-time dietary intake data (40,702 food entries) from 173 patients hospitalized for several weeks for allo-HSCT at MSK and analyzed it alongside longitudinally collected fecal samples. They used a Bayesian mixed-effects model to identify dietary components that may correlate with microbial disruption.

“What jumped out as very predictive of a low diversity fecal sample [and expansion of Enterococcus] in the 2 days prior to collection was the interaction between antibiotics and the consumption of sweets” — foods rich in simple sugars, Peled said. The relationship between sugar and the microbiome occurred only during periods of antibiotic exposure.

“And it was particularly perplexing because the foods that fall into the ‘sweets’ category are foods we encourage people to eat clinically when they’re not feeling well and food intake drops dramatically,” he said. This includes foods like nutritional drinks or shakes, Italian ice, gelatin dessert, and sports drinks.

(In a mouse model of post-antibiotic Enterococcus expansion, Peled and his co-investigators then validated the findings and ruled out the impact of any reductions in fiber.)

In addition to possibly revising dietary recommendations for patients undergoing allo-HSCT, the findings raise the question of whether avoiding sugar intake while on antibiotics, in general, is a way to mitigate antibiotic-induced dysbiosis, he said.

To test the role of probiotics, Peled and colleagues collaborated with Seres Therapeutics on a phase 1b trial of an oral combination (SER-155) of 16 fermented strains “selected rationally,” he said, for their ability to decolonize gut pathogens, improve gut barrier function (in vitro), and reduce gut inflammation and local immune activation.

After a safety lead-in, patients were randomized to receive SER-155 (n = 20) or placebo (n = 14) three times — prior to transplant, upon neutrophil engraftment (with vancomycin “conditioning”), and after transplant. “The strains succeeded in grafting in the [gastrointestinal] tract…and some of them persisted all the way through to day 100,” Peled said.

The incidence of pathogen domination was substantially lower in the probiotic recipients compared to an MSK historical control cohort, and the incidence of BSIs was significantly lower compared to the placebo arm (10% vs 43%, respectively, representing a 77% relative risk reduction), he said.
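
The reported 77% figure follows directly from the two BSI rates; as a quick arithmetic check:

```python
def relative_risk_reduction(rate_treated, rate_control):
    """RRR = 1 - (risk in treated arm / risk in control arm)."""
    return 1.0 - rate_treated / rate_control

# BSI rates reported in the trial: 10% (SER-155) vs 43% (placebo)
rrr = relative_risk_reduction(0.10, 0.43)
print(f"{rrr:.0%}")  # 77%
```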

 

Diet and Immunotherapy Response: Trials at MD Anderson

One of the first trials Daniel-MacDougall launched at MD Anderson on diet and the microbiome, the 16-week BE GONE trial, randomized 55 patients with obesity and a history of colorectal cancer or precancerous polyps either to add a cup of beans to their usual diet or to continue their usual diet without beans. The arms crossed over at 8 weeks, and stool and fasting blood were collected every 4 weeks.

“Beans are a prebiotic powerhouse in my opinion, and they’re also something this population would avoid,” said Daniel-MacDougall, associate professor in the department of epidemiology at MD Anderson and faculty director of the Bionutrition Research Core and Research Kitchen.

“We saw a modest increase in alpha diversity [in the intervention group] and similar trends with microbiota-derived metabolites” that regressed when patients returned to their usual diet, she said. The researchers also documented decreases in proteomic biomarkers of intestinal and systemic immune and inflammatory response.

The impact of diet on cancer survival was shown in subsequent research, including an observational study published in Science in 2021 of patients with melanoma receiving immune checkpoint blockade (ICB) treatment. “Patients who consumed insufficient dietary fiber at the start of therapy tended to do worse [than those reporting sufficient fiber intake],” with significantly lower progression-free survival, Daniel-MacDougall said.

“And interestingly, when we looked at dietary fiber [with and without] probiotic use, patients who had sufficient fiber but did not take probiotics did the best,” she said. (The probiotics were not endorsed or selected by their physicians.)

Now, the researchers at MD Anderson are moving into “precision nutrition” research, Daniel-MacDougall said, with a phase 2 randomized, double-blind trial of high dietary fiber intake (a target of 50 g/d from whole foods) vs a healthy control diet (20 g/d of fiber) in patients with melanoma receiving ICB.

The trial, now underway, is a fully controlled feeding study, with all meals and snacks provided by MD Anderson and macronutrients held constant. Researchers are collecting blood, stool, and tumor tissue (if available) to answer questions about the microbiome, changes in systemic and tissue immunity, disease response and immunotherapy toxicity, and other issues.

Peled disclosed IP licensing and research support from Seres Therapeutics; consulting with Da Volterra, MaaT Pharma, and CSL Behring; and advisory/equity with Postbiotics + Research LLC and Prodigy Biosciences. Daniel-MacDougall reported having no disclosures.

A version of this article appeared on Medscape.com.
