Sleep Changes in IBD Could Signal Inflammation, Flareups
Changes in sleep architecture in patients with inflammatory bowel disease (IBD) may signal underlying inflammation and impending flares, an observational study suggested.
Sleep data from 101 study participants over a mean duration of about 228 days revealed that altered sleep architecture was only apparent when inflammation was present — symptoms alone did not impact sleep cycles or signal inflammation.
“We thought symptoms might have an impact on sleep, but interestingly, our data showed that measurable changes like reduced rapid eye movement (REM) sleep and increased light sleep only occurred during periods of active inflammation,” Robert Hirten, MD, associate professor of Medicine (Gastroenterology), and Artificial Intelligence and Human Health, at the Icahn School of Medicine at Mount Sinai, New York City, told GI & Hepatology News.
“It was also interesting to see distinct patterns in sleep metrics begin to shift over the 45 days before a flare, suggesting the potential for sleep to serve as an early indicator of disease activity,” he added.
“Sleep is often overlooked in the management of IBD, but it may provide valuable insights into a patient’s underlying disease state,” he said. “While sleep monitoring isn’t yet a standard part of IBD care, this study highlights its potential as a noninvasive window into disease activity, and a promising area for future clinical integration.”
The study was published online in Clinical Gastroenterology and Hepatology.
Less REM Sleep, More Light Sleep
Researchers assessed the impact of inflammation and symptoms on sleep architecture in IBD by analyzing data from 101 individuals who answered daily disease activity surveys and wore a wearable device.
The mean age of participants was 41 years and 65.3% were women. Sixty-three participants (62.4%) had Crohn’s disease (CD) and 38 (37.6%) had ulcerative colitis (UC).
Forty participants (39.6%) used an Apple Watch; 50 (49.5%) used a Fitbit; and 11 (10.9%) used an Oura ring. Sleep architecture, sleep efficiency, and total hours asleep were collected from the devices. Participants were encouraged to wear their devices for at least 4 days per week and 8 hours per day and were not required to wear them at night. Participants provided data by linking their devices to ehive, Mount Sinai’s custom app.
Daily clinical disease activity was assessed using the UC or CD Patient Reported Outcome-2 survey. Participants were asked to answer at least four daily surveys each week.
Sleep metrics during symptomatic flares, inflammatory flares, and combined symptomatic and inflammatory activity were compared with those during periods of symptomatic and inflammatory remission.
Furthermore, researchers explored the rate of change in sleep metrics for 45 days before and after inflammatory and symptomatic flares.
Participants contributed a mean of 228.16 nights of wearable data. During active inflammation, they spent a lower percentage of sleep time in REM sleep (20% vs 21.59%) and a greater percentage in light sleep (62.23% vs 59.95%) than during inflammatory remission. No differences were observed in the mean percentage of time in deep sleep, sleep efficiency, or total time asleep.
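For readers curious how per-state averages like these can be derived from nightly device summaries, here is a minimal, illustrative sketch in Python. The column names, data layout, and the within-person averaging step are assumptions for illustration only, not the study’s actual ehive or wearable pipeline.

```python
# Illustrative sketch only: aggregates nightly wearable summaries by disease state.
# Column names and layout are assumed for illustration, not taken from the study.
import pandas as pd

nights = pd.DataFrame({
    "participant_id": [1, 1, 2, 2],
    "inflammation_active": [True, False, True, False],  # e.g., derived from biomarker windows
    "rem_pct": [19.5, 22.0, 20.5, 21.2],
    "light_pct": [63.0, 59.0, 61.5, 60.8],
    "deep_pct": [17.5, 19.0, 18.0, 18.0],
})

# Average within each participant first so people with more nights do not dominate,
# then compare mean sleep-stage percentages during inflammation vs remission.
per_person = nights.groupby(["participant_id", "inflammation_active"]).mean()
summary = per_person.groupby("inflammation_active")[["rem_pct", "light_pct", "deep_pct"]].mean()
print(summary)
```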
During symptomatic flares, there were no differences in the percentage of sleep time in REM sleep, deep sleep, light sleep, or sleep efficiency compared with periods of inflammatory remission. However, participants slept less overall during symptomatic flares than during symptomatic remission.
During asymptomatic but inflamed periods, participants spent a lower percentage of time in REM sleep and more time in light sleep than during asymptomatic and uninflamed periods; however, there were no differences in sleep efficiency or total time asleep.
Similarly, participants had more light sleep and less REM sleep during combined symptomatic and inflammatory flares than during asymptomatic and uninflamed periods, but there were no differences in the percentage of time spent in deep sleep, sleep efficiency, or total time asleep.
Symptomatic flares alone, without inflammation, did not impact sleep metrics, the researchers concluded. However, periods with active inflammation were associated with a significantly smaller percentage of sleep time in REM sleep and a greater percentage of sleep time in light sleep.
The team also performed longitudinal mapping of sleep patterns before, during, and after disease exacerbations by analyzing sleep data for 6 weeks before and 6 weeks after flare episodes.
They found that sleep disturbances worsened significantly in the lead-up to inflammatory flares and improved afterward, suggesting that sleep changes may signal upcoming increases in disease activity. When the researchers evaluated the intersection of inflammatory and symptomatic flares, altered sleep architecture was evident only when inflammation was present.
“These findings raise important questions about whether intervening on sleep can actually impact inflammation or disease trajectory in IBD,” Hirten said. “Next steps include studying whether targeted sleep interventions can improve both sleep and IBD outcomes.”
While this research is still in the early stages, he said, “it suggests that sleep may have a relationship with inflammatory activity in IBD. For patients, it reinforces the value of paying attention to sleep changes.”
The findings also show the potential of wearable devices to guide more personalized monitoring, he added. “More work is needed before sleep metrics can be used routinely in clinical decision-making.”
Validates the Use of Wearables
Commenting on the study for GI & Hepatology News, Michael Mintz, MD, a gastroenterologist at Weill Cornell Medicine and NewYork-Presbyterian in New York City, observed, “Gastrointestinal symptoms often do not correlate with objective disease activity in IBD, creating a diagnostic challenge for gastroenterologists. Burdensome, expensive, and/or invasive testing, such as colonoscopies, stool tests, or imaging, are frequently required to monitor disease activity.”
“This study is a first step in objectively monitoring inflammation in a patient-centric way that does not create undue burden to our patients,” he said. “It also provides longitudinal data that suggests changes in sleep patterns can pre-date disease flares, which ideally can lead to earlier intervention to prevent disease complications.”
Like Hirten, he noted that clinical decisions, such as changing IBD therapy, should not be based on the results of this study. “Rather this provides validation that wearable technology can provide useful objective data that correlates with disease activity.”
Furthermore, he said, it is not clear whether analyzing sleep data is a cost-effective way of monitoring IBD disease activity, or whether that data should be used alone or in combination with other objective disease markers, to influence clinical decision-making.
“This study provides proof of concept that there is a relationship between sleep characteristics and objective inflammation, but further studies are needed,” he said. “I am hopeful that this technology will give us another tool that we can use in clinical practice to monitor disease activity and improve outcomes in a way that is comfortable and convenient for our patients.”
This study was supported by a grant to Hirten from the US National Institutes of Health. Hirten reported receiving consulting fees from Bristol Myers Squibb and AbbVie; stock options from Salvo Health; and research support from Janssen, Intralytix, EnLiSense, and the Crohn’s and Colitis Foundation. Mintz declared no competing interests.
A version of this article appeared on Medscape.com.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Can Nonresponders to Antiobesity Medicines Be Predicted?
Genetic and phenotype-based testing may help identify which patients are likely to respond poorly, or not at all, to a given weight loss drug, enabling clinicians to better tailor antiobesity medication (AOM) to the patient.
Currently, patient response to AOMs varies widely, with some patients responding robustly and others weakly or not at all.
For example, trials of the GLP-1 semaglutide found that 32%-39.6% of people are “super responders,” achieving weight loss in excess of 20%, and a subgroup of 10.2%-16.7% of individuals are nonresponders. Similar variability was found with other AOMs, including the GLP-1 liraglutide and tirzepatide, a dual GLP-1/glucose-dependent insulinotropic polypeptide receptor agonist.
Studies of semaglutide suggest that people with obesity and type 2 diabetes (T2D) lose less weight on the drug than those without T2D, and men tend to lose less weight than women.
However, little else is known about predictors of response rates for various AOMs, and medication selection is typically based on patient or physician preference, comorbidities, medication interactions, and insurance coverage.
Although definitions of a “nonresponder” vary, the Endocrine Society’s latest guideline, which many clinicians follow, states that an AOM is considered effective if patients lose more than 5% of their body weight within 3 months.
Can nonresponders and lower responders be identified and helped? Yes, but it’s complicated.
“Treating obesity effectively means recognizing that not all patients respond the same way to the same treatment, and that’s not a failure; it’s a signal,” said Andres Acosta, MD, PhD, an obesity expert at Mayo Clinic, Rochester, Minnesota, and a cofounder of Phenomix Sciences, a biotech company in Menlo Park, California.
“Obesity is not a single disease. It’s a complex, multifactorial condition driven by diverse biological pathways,” he told GI & Hepatology News. “Semaglutide and other GLP-1s primarily act by reducing appetite and slowing gastric emptying, but not all patients have obesity that is primarily driven by appetite dysregulation.”
Phenotype-Based Profiling
Figuring out what drives an individual’s obesity is where a phenotype-based profiling test could possibly help.
Acosta and colleagues previously used a variety of validated studies and questionnaires to identify four phenotypes that represent distinct biologic drivers of obesity: hungry brain (abnormal satiation), emotional hunger (hedonic eating), hungry gut (abnormal satiety), and slow burn (decreased metabolic rate). In their pragmatic clinical trial, phenotype-guided AOM selection was associated with 1.75-fold greater weight loss after 12 months than the standard approach to drug selection, with mean weight loss of 15.9% and 9%, respectively.
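As a quick arithmetic check of that fold difference, the ratio of the two reported means can be computed directly; the snippet below simply uses the figures quoted above.

```python
# Quick check using the means quoted above: ratio of mean weight loss with
# phenotype-guided AOM selection vs standard selection at 12 months.
phenotype_guided = 15.9  # mean % total body weight lost
standard = 9.0
print(f"{phenotype_guided / standard:.2f}-fold")  # ~1.77-fold, in line with the reported ~1.75-fold
```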
“If a patient’s obesity isn’t primarily rooted in the mechanisms targeted by a particular drug, their response will naturally be limited,” Acosta said. “It’s not that they’re failing the medication; the medication simply isn’t the right match for their biology.”
For their new study, published online in Cell Metabolism, Acosta and colleagues built on their previous research by analyzing the genetic and nongenetic factors that influenced calories needed to reach satiation (Calories to Satiation [CTS]) in adults with obesity. They then used machine learning techniques to develop a CTS gene risk score (CTS-GRS) that could be measured by a DNA saliva test.
The study included 717 adults with obesity (mean age, 41 years; 75% women) with marked variability in satiation, ranging from 140 to 2166 kcal to reach satiation.
CTS was assessed through an ad libitum meal, combined with physiological and behavioral evaluations, including calorimetry, imaging, blood sampling, and gastric emptying tests. The largest contributors to CTS variability were sex and genetic factors, while anthropometric measurements played lesser roles.
Various analyses and assessments of participants’ CTS-GRS scores showed that individuals with a high CTS-GRS, or hungry brain phenotype, experienced significantly greater weight loss when treated with phentermine/topiramate than those with a low CTS-GRS, or hungry gut, phenotype. After 52 weeks of treatment, individuals with the hungry brain phenotype lost an average of 17.4% of their body weight compared with 11.2% in those with the hungry gut phenotype.
An analysis of a separate 16-week study showed that patients with the hungry gut phenotype responded better to the GLP-1 liraglutide, losing 6.4% total body weight, compared to 3.3% for those with the hungry brain phenotype.
Overall, the CTS-GRS test predicted drug response with up to 84% accuracy (area under the curve, 0.76 in men and 0.84 in women). The authors acknowledged that these results need to be replicated prospectively and in more diverse populations to validate the test’s predictive ability.
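To illustrate what an area under the curve (AUC) of this kind measures, here is a minimal, hedged sketch in Python using scikit-learn; the responder labels and score values are synthetic placeholders, not data from the Cell Metabolism study.

```python
# Illustrative sketch: computing an area under the ROC curve (AUC) for a
# hypothetical risk score against observed responder status.
from sklearn.metrics import roc_auc_score

responder = [1, 0, 1, 1, 0, 0, 1, 0]  # 1 = met the weight-loss response threshold (synthetic)
risk_score = [0.82, 0.35, 0.67, 0.74, 0.41, 0.29, 0.58, 0.51]  # e.g., a genetic risk score (synthetic)

auc = roc_auc_score(responder, risk_score)
print(f"AUC = {auc:.2f}")
# An AUC of 0.84 would mean an 84% chance the score ranks a randomly chosen
# responder above a randomly chosen nonresponder.
```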
“This kind of phenotype-based profiling allows us to predict which patients are more likely to respond and who might need a different intervention,” Acosta said. “It’s a critical step toward eliminating trial-and-error in obesity treatment.”
The test (MyPhenome test) is used at more than 80 healthcare clinics in the United States, according to Phenomix Sciences, which manufactures it. A company spokesperson said the test does not require FDA approval because it is used to predict obesity phenotypes to help inform treatment, but not to identify specific medications or other interventions. “If it were to do the latter,” the spokesperson said, “it would be considered a ‘companion diagnostic’ and subject to the FDA clearance process.”
What to Do if an AOM Isn’t Working?
It’s one thing to predict whether an individual might do better on one drug vs another, but what should clinicians do meanwhile to optimize weight loss for their patients who may be struggling on a particular drug?
“Efforts to predict the response to GLP-1 therapy have been a hot topic,” noted Sriram Machineni, MD, associate professor at Montefiore Medical Center, Bronx, New York, and founding director of the Fleischer Institute Medical Weight Center at Montefiore Einstein. Although the current study showed that genetic testing could predict responders, he agreed with Acosta that the results need to be replicated prospectively.
“In the absence of a validated tool for predicting response to specific medications, we use a prioritization process for trialing medications,” Machineni told GI & Hepatology News. “The prioritization is based on the suitability of the side-effect profile to the specific patient, including contraindications; benefits independent of weight loss, such as cardiovascular protection for semaglutide; average efficacy; and financial accessibility for patients.”
Predicting responders isn’t straightforward, said Robert Kushner, MD, professor of medicine and medical education at the Feinberg School of Medicine at Northwestern University and medical director of the Wellness Institute at Northwestern Memorial Hospital in Chicago.
“Despite looking at baseline demographic data such as race, ethnicity, age, weight, and BMI, we are unable to predict who will lose more or less weight,” he told GI & Hepatology News. The one exception is that women generally lose more weight than men. “However, even among females, we cannot discern which females will lose more weight than other females,” he said.
If an individual is not showing sufficient weight loss on a particular medication, “we first explore potential reasons that can be addressed, such as the patient is not taking the medication or is skipping doses,” Kushner said. If need be, they discuss changing to a different drug to improve compliance. He also stresses the importance of making lifestyle changes in diet and physical activity for patients taking AOMs.
Patients who do not lose at least 5% of their weight within 3 months are often unlikely to respond well to that medication even if they remain on it. “So, early response rates determine longer-term success,” Kushner said.
Acosta said that if a patient isn’t responding to one class of medication, he pivots to a treatment better aligned with their phenotype. “That could mean switching from a GLP-1 to a medication like [naltrexone/bupropion] or trying a new method altogether,” he said. “The key is that the treatment decision is rooted in the patient’s biology, not just a reaction to short-term results. We also emphasize the importance of long-term follow-up and support.”
The goal isn’t just weight loss but also improved health and quality of life, Acosta said. “Whether through medication, surgery, or behavior change, what matters most is tailoring the care plan to each individual’s unique biology and needs.”
The new study received support from the Mayo Clinic Clinical Research Trials Unit, Vivus Inc., and Phenomix Sciences. Acosta is supported by a National Institutes of Health grant.
Acosta is a co-founder and inventor of intellectual property licensed to Phenomix Sciences Inc.; has served as a consultant for Rhythm Pharmaceuticals, Gila Therapeutics, Amgen, General Mills, Boehringer Ingelheim, Currax Pharmaceuticals, Nestlé, Bausch Health, and Rare Diseases; and has received research support or had contracts with Vivus Inc., Satiogen Pharmaceuticals, Boehringer Ingelheim, and Rhythm Pharmaceuticals. Machineni has been involved in semaglutide and tirzepatide clinical trials and has been a consultant to Novo Nordisk, Eli Lilly and Company, and Rhythm Pharmaceuticals. Kushner is on the scientific advisory board for Novo Nordisk.
A version of this article appeared on Medscape.com.
More Evidence Supports ‘Individualized Approach’ to Pre-Endoscopy GLP-1 RAs
Retained gastric contents were identified in fewer than 10% of patients taking glucagon-like peptide 1 receptor agonists (GLP-1 RAs) who underwent endoscopy, according to a study published in The American Journal of Gastroenterology. Moreover, most instances occurred in patients using the drugs for type 2 diabetes (T2D) rather than for weight loss alone.
The findings suggest adopting an individualized approach rather than universal preoperative withholding of GLP-1 RAs before endoscopy, concluded Jennifer Phan, MD, medical director of the Hoag Advanced Endoscopy Center in Newport Beach, California, and colleagues. These agents are associated with slowed gastric emptying, possibly raising the risk for pulmonary aspiration. The study identified comorbid uncontrolled T2D as a risk factor for retained gastric contents.
Recommendations from gastroenterological societies and the American Society of Anesthesiologists (ASA) differ regarding pre-endoscopic holding of these ubiquitous agents used for obesity and T2D. “Many patients undergo routine endoscopic procedures, and there was concern from the anesthesia safety perspective for retained gastric contents,” Phan told GI & Hepatology News. “At first these events were seen in a handful of cases; however, out of precaution this resulted in a statement from the ASA recommending that patients hold their GLP-1 medications for at least 1 week prior to a routine endoscopic procedure.”
That guidance resulted in protocol changes within endoscopy units, cancelled procedures, and potential delays in patient care. “We wanted to study whether this concern was clinically valid and to help identify which subgroup of patients are at highest risk in order to best inform anesthesia and endoscopy practices,” Phan added.
The ASA updated its guidance in 2023.
The current study aligns with other research showing that rates of clinically relevant retained gastric contents are < 10%, Phan said. For instance, the American Gastroenterological Association (AGA) published a rapid clinical practice update in November 2023 that found insufficient evidence to support patients stopping the medications before endoscopic procedures. AGA guidance suggests an individual approach for each patient on a GLP-1 RA rather than a blanket statement on how to manage all patients taking the medications.
“Our initial hypothesis was that the rates of clinically relevant retained gastric contents in patients on GLP-1 RA medications would be low,” Phan noted. “This was born out of anecdotal experience of the limited number of aborted procedures we experienced before the ASA statement.”
Her group also hypothesized that the indication for which the GLP-1 RA was prescribed would be important, with patients taking GLP-1 RA medications for diabetes potentially having a higher likelihood of retained contents given the concomitant propensity for delayed gastric motility related to uncontrolled hyperglycemia.
The Study
The investigators identified 815 patients on confirmed GLP-1 RA medications of various types who underwent endoscopy from 2021 to 2023 at four centers. Demographics, prescribing practices, and procedure outcomes were captured. Preoperative GLP-1 RA holding was retrospectively classified per ASA guidance.
Of the 815 patients (mean age, 67.7 years; 57.7% women; 53.9% White individuals), 70 (8.7%) exhibited retained gastric contents on endoscopy. Of these, 65 (93%) had T2D, with a median A1c of 6.5%. Among those with retained contents, most had a minimal (10, 14.3%) or moderate (31, 44.3%) amount of food retained, although 29 (41.4%) had a large quantity. Only one patient required unplanned intubation because of a large quantity of residual content, and none had aspiration events.
In multivariate analysis, the odds ratio of retention in those with diabetes was 4.1. “Given the predominance of diabetes in those with retained gastric contents, we highlight the potential to risk-stratify patients who require further preprocedural consideration,” the authors wrote.
Patients whose GLP-1 RA was held per ASA guidance (406, 49.8%) were less likely to have retained contents (4.4% vs 12.7%; P < .001), but no significant differences were observed in intubation (0% vs 2%; P = .53) or procedure-abort rates (28% vs 18%; P = .40) due to gastric retention.
On multivariable analysis, the likelihood of food retention increased by 36% (95% CI, 1.15-1.60) for every 1% increase in glycosylated hemoglobin after adjusting for GLP-1 RA type and preoperative medication hold.
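To make that per-point estimate concrete, the short sketch below shows how a 36% per-point increase compounds over larger A1c differences. Treating the 36% figure as an odds ratio of about 1.36 per percentage point is an assumption based on the reported 95% CI, not a value stated explicitly by the authors.

```python
# Illustrative arithmetic only: if the reported 36% increase corresponds to an
# odds ratio of roughly 1.36 per 1-percentage-point rise in A1c (an assumption
# based on the 95% CI of 1.15-1.60), the effect compounds multiplicatively.
per_point_or = 1.36

for delta_a1c in (1, 2, 3):
    print(f"A1c +{delta_a1c} point(s): odds multiplied by ~{per_point_or ** delta_a1c:.2f}")
# A1c +1 point(s): odds multiplied by ~1.36
# A1c +2 point(s): odds multiplied by ~1.85
# A1c +3 point(s): odds multiplied by ~2.52
```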
“Our study can help to differentiate which patients can be at largest risk for retained gastric contents,” Phan said, noting the impact of increasing A1c levels. “There’s a 36% increased likelihood of food retention in patients on GLP-1 medications, so a blanket policy to hold GLP-1s in patients who are nondiabetic and taking the medication for obesity may not be the best approach. But if patients have uncontrolled hyperglycemia, then an approach of caution is clinically valid.” In that context, holding the GLP-1 RA injection or lengthening the preoperative clear-liquid diet should be considered.
She noted that the study results are generalizable because the study was conducted across multiple types of hospital systems, both university and county, and included all types of GLP-1 RA.
Offering an anesthesiologist’s perspective on the study, Paul Potnuru, MD, an assistant professor in the Department of Anesthesiology, Critical Care, and Pain Medicine at UTHealth Houston and not involved in the study, called the findings “somewhat reassuring” but said the risk for aspiration was still a consideration.
A recent review, however, reported that the risk for GLP-1 RA-associated pulmonary aspiration was low.
Potnuru acknowledged that the original ASA guidance on preoperative GLP-1 RA cessation led to some confusion. “There were not a lot of data on the issue, but some studies found that even with stopping GLP-1s 2 weeks preoperatively some patients still retained gastric content,” he told GI & Hepatology News.
A study at his center recently reported that 56% of GLP-1 RA users had increased pre-anesthesia residual gastric content compared with 19% of nonusers.
From the anesthesiologist’s clinical vantage point, the margin of safety is an issue even if aspiration risk is low. “If there’s a 1 in 1000 chance or even a 1 in 3000 chance, that can be considered too high,” Potnuru said.
He further noted that the current study included only 815 patients, not nearly enough for definitive data. In addition, a retrospective study based on medical records can’t really capture all the real-world procedural changes made in the operating room. “It’s common for anesthesiologists not to document all cases of intubation, for example,” he said.
While the ideal is a completely empty stomach, he agreed that a practical alternative to stopping GLP-1 RA therapy, especially that prescribed for diabetes, would be a 24-hour liquid diet, which would clear the stomach quickly. “If you stop these drugs in patients taking them for diabetes, you get a worsening of their glycemic control,” he said.
He noted that patients have different risk tolerances, with some willing to go ahead even if ultrasound shows gastric retention, while some opt to cancel.
Prospective studies are needed, Potnuru added, “because you find more if you know what you’re looking for.” His center is starting a clinical trial in 150 patients to assess the impact of a 24-hour, liquids-only diet on gastric retention.
According to Phan, other research is following GLP-1 RA users undergoing colonoscopy. “Future studies can look at the added value of point-of-care abdominal ultrasound to see if it increases precision preoperative management in these patients on GLP-1 medications.”
Other groups are examining the safety of these agents in the general context of sedation. “It’s worth noting that the studies are being done on currently available medications and may not apply to future medications such as triple agonists or anti-amylins that may come on the market in the near future,” Phan said.
This study received no financial support. Neither the study authors nor Potnuru had any conflicts of interest.
A version of this article appeared on Medscape.com.
What Effect Can a ‘Caring Message’ Intervention Have?
Caring messages to veterans at risk for suicide come in many forms: cards, letters, phone calls, email, and text messages. Each message can have a major impact on the veteran’s mental health and their decision to use health care provided by the US Department of Veterans Affairs (VA). A recent study outlined ways to centralize that impact, ensuring the caring message reaches those who need it most.
The study examined the impact of the VA Veterans Crisis Line (VCL) caring letters intervention among veterans at increased psychiatric risk. It focused on veterans with ≥ 2 Veterans Health Administration (VHA) health service encounters within 24 months prior to VCL contact. The primary outcome was suicide-related events (SRE), including suicide attempts, intentional self-harm, and suicidal self-directed violence. Secondary outcomes included VHA health care use (all-cause inpatient and outpatient, mental health outpatient, mental health inpatient, and emergency department).
Of 186,514 VCL callers, 8.3% had a psychiatric hospitalization, 4.8% were flagged as high risk by the REACH VET program, 6.2% had an SRE, and 12.9% met any of these criteria in the year prior to initial VCL contact. There was no association between caring letters and all-cause mortality or SRE, even though caring letters is one of the only interventions to have demonstrated a reduction in suicide mortality in a randomized controlled trial.
Although the expected reduction in suicide was not observed, caring letters have consistently been associated with increased use of outpatient mental health services. The analysis found that veterans both with and without indicators of elevated psychiatric risk were using services more. That, the researchers suggest, is further evidence that caring letters might prompt engagement with VHA care, even among veterans not identified as high risk.
Psychiatrist Jerome A. Motto, MD, believed long-term supportive but nondemanding contact could reduce a suicidal person’s sense of isolation and enhance feelings of connectedness. His 1976 intervention established a plan to “exert a suicide prevention influence on high-risk persons who decline to enter the health care system.” In Motto’s 5-year follow-up study of 3,006 psychiatric inpatients, half of those who were not following their postdischarge treatment plan received calls or letters expressing interest in their well-being. Suicidal deaths were found to “diverge progressively,” leading Motto to claim the study showed “tentative evidence” that a high-risk population for suicide can be identified and that risk might be reduced through a systematic approach.
Despite those findings, the results of studies on repeated follow-up contact have been mixed. One review outlined how 5 studies showed a statistically significant reduction in suicidal behavior, 4 showed mixed results with trends toward a preventive effect, and 2 studies did not show a preventive effect.
In 2020, the VA launched a caring letters intervention for veterans who contacted the VCL. In the first 12 months, caring letters were sent to > 100,000 veterans. In feedback interviews, participants described feeling appreciated, cared for, encouraged, and connected. They also said that the letters helped them engage with community resources and made them more likely to seek VA care. Even veterans who were skeptical of the utility of the caring letters sometimes admitted keeping them.
Finding effective ways to prevent suicide among veterans has been a top priority for the VA. In 2021, then-US Surgeon General Jerome Adams issued a Call to Action that recommended using caring letters when gaps in care may exist, including following crisis line calls.
Military Service May Increase Risk for Early Menopause
Traumatic and environmental exposures during military service may put women veterans at risk for early menopause, a recent longitudinal analysis of data from 668 women in the Gulf War Era Cohort Study found.
The study examined associations between possible early menopause (aged < 45 years) and participants’ Gulf War deployment, military environmental exposures (MEEs), Gulf War Illness, military sexual trauma (MST), and posttraumatic stress disorder (PTSD).
Of 384 Gulf War–deployed veterans, 63% reported MEEs and 26% reported MST during deployment. More than half (57%) of study participants (both Gulf War veterans and nondeployed veterans) met criteria for Gulf War Illness, and 23% met criteria for probable PTSD.
At follow-up, 15% of the women had possible early menopause—higher than population estimates for early menopause in the US, which range from 5% to 10%.
Gulf War deployment, Gulf War–related environmental exposures, and MST during deployment were not significantly associated with early menopause. However, both Gulf War Illness (odds ratio [OR], 1.83; 95% CI, 1.14 to 2.95) and probable PTSD (OR, 2.45; 95% CI, 1.54 to 3.90) were strongly associated with early menopause. Women with probable PTSD at baseline had more than double the odds of possible early menopause.
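Because odds ratios are easy to misread as risk ratios, the short Python sketch below converts the reported odds ratio for probable PTSD into the absolute risk it would imply under assumed baseline rates. The baseline values are illustrative assumptions, chosen loosely from the 5%-10% population estimates and the 15% cohort rate, not calculations reported by the study authors.

# Illustrative only: converts an odds ratio into the implied probability for an
# assumed baseline risk. Baseline rates here are assumptions, not study data.
def risk_from_or(baseline_risk, odds_ratio):
    """Probability implied by applying an odds ratio to a baseline probability."""
    baseline_odds = baseline_risk / (1 - baseline_risk)
    new_odds = baseline_odds * odds_ratio
    return new_odds / (1 + new_odds)

for p0 in (0.05, 0.10, 0.15):  # assumed baseline rates of possible early menopause
    print(f"baseline {p0:.0%} -> implied {risk_from_or(p0, 2.45):.1%} with OR 2.45")
# Expected output: 5% -> 11.4%, 10% -> 21.4%, 15% -> 30.2%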
Previous research suggests that deployment, MEEs, and Gulf War Illness are broadly associated with adverse reproductive health conditions in women veterans. Exposure to persistent organic pollutants and combustion byproducts (eg, from industrial processes and burn pits) have been linked to ovarian dysfunction and oocyte destruction presumed to contribute to accelerated ovarian aging.
The average age for menopause in the US is 52 years. About 5% of women go through early menopause naturally. Early and premature (< 40 years) menopause may also result from a medical or surgical cause, such as a hysterectomy. Regardless of the cause, early menopause can have a profound impact on a woman’s physical, emotional, and mental health. It is associated with premature mortality, poor bone health, sexual dysfunction, a 50% increased risk of cardiovascular disease, and 2-fold increased odds of depression.
“Sometimes we talk about menopause symptoms thinking that they're just sort of 1 brief point in time, but we're also talking about things that may affect women's health and functioning for a third or half of a lifespan,” Carolyn Gibson, PhD, MPH, said at the 2024 Spotlight on Women's Health Cyberseminar Series.
Gibson, a staff psychologist at the San Francisco Veterans Affairs (VA) Women’s Mental Health Program and lead author on the recent early-menopause study, pointed to some of the chronic physical health issues that might develop, such as cardiovascular disease, but also the psychological effects.
“It's just important,” she said during the Cyberseminar Series, “to think about the number of things that women in midlife tend to be juggling and managing, all of which may turn up the volume on symptom experience and vulnerability to health and mental health challenges during this period.”
The findings of the study have clinical implications. Midlife women veterans (aged 45 to 64 years) are the largest group of women veterans enrolled in VA health care. Early menopause brings additional age-related care considerations. The authors advise prioritizing support for routine screening for menopause status and symptoms as well as gender-sensitive training, resources, and staffing to provide comprehensive, trauma-informed, evidence-based menopause care for women at any age.
FDA Advisory Panel Votes NO on Belantamab for Myeloma
A bid by GlaxoSmithKline (GSK) to bring its multiple myeloma drug belantamab mafodotin (Blenrep) back to the market hit a stumbling block during an FDA panel meeting held on July 17.
The FDA’s Oncologic Drugs Advisory Committee (ODAC) voted 5-3 against belantamab in combination with bortezomib and dexamethasone and 7-1 against belantamab in combination with pomalidomide and dexamethasone on the specific questions of whether the benefits of each treatment regimen at the proposed doses outweigh the risks for patients with relapsed or refractory disease after at least one prior line of therapy.
ODAC members voting no cited concerns about the lack of exploration of optimal dosing, as well as high rates of ocular toxicity and a lack of diversity among trial participants.
“This was a challenging decision because the efficacy data were strong, but the toxicity data were also very strong,” said Neil Vasan, MD, PhD, of New York University Langone Health in New York City.
Regarding optimal dosing, Vasan, who voted no on both questions, cited “a missed opportunity over the course of many years during the development of this drug to explore these different dosages,” but he also noted that “the building blocks are here to explore this question in the future.”
Belantamab, an antibody-drug conjugate targeting B-cell maturation antigen, was granted accelerated approval as a late-line therapy for relapsed or refractory multiple myeloma in August 2020 based on findings from the DREAMM-2 trial. However, GSK voluntarily withdrew the drug from the US market in 2023 after the confirmatory DREAMM-3 trial did not meet its primary endpoint of improved progression-free survival (PFS).
The company continued to explore belantamab in combination with other agents and in earlier lines of therapy. Based on findings from the DREAMM-7 and DREAMM-8 trials, which both showed improved PFS vs standard-of-care triplet therapies, the company submitted a new Biologics License Application in November 2024 seeking approval of the belantamab-based regimens.
Findings from DREAMM-7 and DREAMM-8 were reported at the 2025 American Society of Clinical Oncology conference in Chicago in June.
Both studies met their primary PFS endpoints, but the FDA expressed concerns about adverse events, dosing, and the relevance of the data for US patients. It therefore sought input from ODAC members on the proposed dosages: 2.5 mg/kg every 3 weeks for the belantamab plus bortezomib and dexamethasone combination, and 2.5 mg/kg in cycle 1 followed by 1.0 mg/kg every 4 weeks for the belantamab plus pomalidomide and dexamethasone combination.
Although GSK and several patients with multiple myeloma touted life-saving benefits of belantamab and argued that ocular toxicity associated with treatment is manageable and transient, most — but not all — ODAC members were unconvinced, at least as to the immediate questions regarding the benefit-risk profile.
“This is probably one of the most difficult votes I’ve done as a member of this committee,” said Grzegorz S. Nowakowski, MD, of the Mayo Clinic, Rochester, Minnesota, who voted yes on belantamab plus bortezomib and dexamethasone.
Nowakowski noted mistakes made from a regulatory perspective, including a lack of appropriate US patient representation in the trials and insufficient attention to dose optimization, but ultimately said that, as a practicing hematologist, he couldn’t ignore the drug’s clear activity, including a possible overall survival benefit, and the potential for mitigating toxicity with careful follow-up and dose reductions.
John DeFlice, MD, of Cedars-Sinai Samuel Oschin Cancer Center in Los Angeles — a multiple myeloma survivor and patient representative on the committee — voted yes on both questions, noting that, based on the testimony of patients and the clinical experience of the investigators, belantamab is “an amazing drug for an incurable disease.”
“I think [these] are the wrong issues to be evaluated,” DeFlice said of the specific questions posed by the FDA at the hearing.
The FDA considers the recommendations of its advisory panels in making final approval decisions but is not bound by them.
A version of this article first appeared on Medscape.com.
Stay Alert to Sleep Apnea Burden in the Military
Obstructive sleep apnea (OSA) was associated with a significantly increased risk for adverse health outcomes and health care resource use among military personnel in the US, according to data from about 120,000 active-duty service members.
OSA and other clinical sleep disorders are common among military personnel, driven in part by demanding, nontraditional work schedules that can exacerbate sleep problems, but OSA’s impact in this population has not been well-studied, Emerson M. Wickwire, PhD, of the University of Maryland School of Medicine, Baltimore, and colleagues wrote in a new paper published in Chest.
In the current health economic climate of increasing costs and limited resources, the economic aspects of sleep disorders have never been more important, Wickwire said in an interview. The data in this study are the first to quantify the health and utilization burden of OSA in the US military and can support military decision-makers regarding allocation of scarce resources, he said.
To assess the burden of OSA in the military, the researchers reviewed fully de-identified data from 59,203 active-duty military personnel with diagnoses of OSA and compared them with 59,203 active-duty military personnel without OSA. The participants ranged in age from 18 to 64 years; 7.4% were women and 64.5% were White individuals. Study outcomes included new diagnoses of physical and psychological health conditions, as well as health care resource use in the first year after the index date.
About one third of the participants were in the Army (38.7%), 25.6% were in the Air Force, 23.5% were in the Navy, 5.8% were in the Marines, 5.7% were in the Coast Guard, and 0.7% were in the Public Health Service.
Over the 1-year study period, military personnel with OSA diagnoses were significantly more likely to experience new physical and psychological adverse events than control individuals without OSA, based on proportional hazards models. The physical conditions with the greatest increased risk in the OSA group were traumatic brain injury and cardiovascular disease (which included acute myocardial infarction, atrial fibrillation, ischemic heart disease, and peripheral procedures), with hazard ratios (HRs) of 3.27 and 2.32, respectively. The psychological conditions with the greatest increased risk in the OSA group vs control individuals were posttraumatic stress disorder (PTSD) and anxiety (HRs, 4.41 and 3.35, respectively).
Individuals with OSA also showed increased use of healthcare resources compared with control individuals without OSA, with an additional 170,511 outpatient visits, 66 inpatient visits, and 1,852 emergency department visits.
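To put those utilization totals on a per-person footing, the short Python sketch below divides the reported excess visits by the size of the OSA cohort. Treating the excess visits as 1-year totals across the 59,203 service members with OSA is an assumption made for illustration; the authors' exact accounting may differ.

# Illustrative only: assumes the reported excess visits are 1-year totals
# across the 59,203 service members with OSA; that framing is an assumption.
osa_cohort = 59_203
excess_visits = {"outpatient": 170_511, "inpatient": 66, "emergency department": 1_852}

for setting, visits in excess_visits.items():
    print(f"{setting}: ~{visits / osa_cohort:.3f} additional visits per person")
# Expected output: outpatient ~2.880, inpatient ~0.001, emergency department ~0.031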
Don’t Discount OSA in Military Personnel
“From a clinical perspective, these findings underscore the importance of recognizing OSA as a critical risk factor for a wide array of physical and psychological health outcomes,” the researchers wrote in their discussion.
The results highlight the need for more clinical attention to patient screening, triage, and delivery of care, but efforts are limited by the documented shortage of sleep specialists in the military health system, they noted.
Key limitations of the study include the use of an administrative claims data source, which did not include clinical information such as disease severity or daytime symptoms, and the nonrandomized, observational study design, Wickwire told this news organization.
Looking ahead, the researchers at the University of Maryland School of Medicine and the Uniformed Services University, Bethesda, Maryland, are launching a new trial to assess the clinical effectiveness and cost-effectiveness of telehealth visits for military beneficiaries diagnosed with OSA as a way to manage the shortage of sleep specialists in the military health system, according to a press release from the University of Maryland.
“Although the association between poor sleep and traumatic stress is well-known, present results highlight striking associations between sleep apnea and posttraumatic stress disorder, traumatic brain injury, and musculoskeletal injuries, which are key outcomes from the military perspective,” Wickwire told this news organization.
“Our most important clinical recommendation is for healthcare providers to be on alert for signs and symptoms of OSA, including snoring, daytime sleepiness, and morning dry mouth,” said Wickwire. “Primary care and mental health providers should be especially attuned,” he added.
Results Not Surprising, but Research Gaps Remain
“The sleep health of active-duty military personnel is not only vital for optimal military performance but also impacts the health of Veterans after separation from the military,” said Q. Afifa Shamim-Uzzaman, MD, an associate professor and a sleep medicine specialist at the University of Michigan, Ann Arbor, Michigan, in an interview.
The current study identifies increased utilization of healthcare resources by active-duty personnel with sleep apnea, and the outcomes were not surprising, said Shamim-Uzzaman, who is employed by the Veterans’ Health Administration but was not involved in the current study.
The association between untreated OSA and medical and psychological comorbidities such as cardiovascular disease, diabetes, and mood disorders such as depression and anxiety is well-known, Shamim-Uzzaman said. “Patients with depression who also have sleep disturbances are at higher risk for suicide — the strength of this association is such that it led the Veterans’ Health Administration to mandate suicide screening for Veterans seen in its sleep clinics,” he added.
“We also know that untreated OSA is associated with excessive daytime sleepiness, slowed reaction times, and increased risk of motor vehicle accidents, all of which can contribute to sustaining injuries such as traumatic brain injury,” said Shamim-Uzzaman. “Emerging evidence also suggests that sleep disruption prior to exposure to trauma increases the risk of developing PTSD. Therefore, it is not surprising that patients with sleep apnea would have higher healthcare utilization for non-OSA conditions than those without sleep apnea,” he noted.
In clinical practice, the study underscores the importance of identifying and managing sleep health in military personnel, who frequently work nontraditional schedules with long, sustained shifts in grueling conditions not conducive to healthy sleep, Shamim-Uzzaman told this news organization. “Although the harsh work environments that our active-duty military endure come part and parcel with the job, clinicians caring for these individuals should ask specifically about their sleep and working schedules to optimize sleep as best as possible; this should include, but not be limited to, screening and testing for sleep disordered breathing and insomnia,” he said.
The current study has several limitations, including the inability to control for smoking or alcohol use, which are common in military personnel and associated with increased morbidity, said Shamim-Uzzaman. The study also did not assess the impact of other confounding factors, such as sleep duration and daytime sleepiness, that could impact the results, especially the association of OSA and traumatic brain injury, he noted. “More research is needed to assess the impact of these factors as well as the effect of treatment of OSA on comorbidities and healthcare utilization,” he said.
This study was supported by the Military Health Services Research Program.
Wickwire’s institution had received research funding from the American Academy of Sleep Medicine Foundation, Department of Defense, Merck, National Institutes of Health/National Institute on Aging, ResMed, the ResMed Foundation, and the SRS Foundation. Wickwire disclosed serving as a scientific consultant to Axsome Therapeutics, Dayzz, Eisai, EnsoData, Idorsia, Merck, Nox Health, Primasun, Purdue, and ResMed and is an equity shareholder in Well Tap.
Shamim-Uzzaman is an employee of the Veterans’ Health Administration.
A version of this article first appeared on Medscape.com.
You Are When You Eat: Microbiome Rhythm and Metabolic Health
Similar to circadian rhythms that help regulate when we naturally fall asleep and wake up, microbial rhythms in our gut are naturally active at certain times of the day to help regulate our digestion.
Investigators from the University of California, San Diego, set out to track these microbial rhythms to determine whether aligning the times we eat to when our gut microbes are most active – time-restricted feeding (TRF) – can bolster our metabolic health. Their research was published recently in Cell Host & Microbe.
“Microbial rhythms are daily fluctuations in the composition and function of microbes living in our gut. Much like how our bodies follow an internal clock (circadian rhythm), gut microbes also have their own rhythms, adjusting their activities based on the time of day and when we eat,” said Amir Zarrinpar, MD, PhD, a gastroenterologist at UC San Diego School of Medicine, and senior author of the study.
Zarrinpar and his team were particularly interested in observing whether adopting the TRF approach counteracted the harmful metabolic effects often associated with consuming a high-fat diet.
The study is also notable for the team’s use of technology able to observe real-time microbial changes in the gut — something not previously attainable with existing metagenomics.
How the Study Evolved With New Tech
Researchers separated three groups of mice to analyze their microbiome activity: one on a high-fat diet with unrestricted access, another on the same high-fat diet within a TRF window of 8 hours per day, and a control group on a normal chow diet with unrestricted access.
“In mice, [their] microbial rhythms are well-aligned with their nocturnal lifestyle. For example, during their active (nighttime) period, certain beneficial microbial activities increase, helping digest food, absorb nutrients, and regulate metabolism,” said Zarrinpar. As a result, the team made sure the mice’s TRF window was at night or when they would normally be awake.
“We chose an 8-hour feeding window based on earlier research showing this time period allows mice to consume the same total calories as those with unlimited food access,” said Zarrinpar. “By controlling [the] calories in this way, we ensure any metabolic or microbial benefits we observe are specifically due to the timing of eating, rather than differences in total food intake.”
But before any observations could be made, the team first needed a way to see real-time changes in the animals’ gut microbiomes.
Zarrinpar and his team were able to do so thanks to metatranscriptomics, a technique that captures real-time microbial activity by profiling RNA transcripts. Compared with the more traditional technique of metagenomics, which can only show which genes are present, metatranscriptomics provided more in-depth temporal and activity-related context, allowing the team to observe dynamic microbial changes.
“[Metatranscriptomics] helps us understand not just which microbes are present, but specifically what they are doing at any given moment,” said Zarrinpar. “In contrast, metagenomics looks only at microbial DNA, which provides information about what microbes are potentially capable of doing, but doesn’t tell us if those genes are actively expressed. By comparing microbial gene expression (using metatranscriptomics) and microbial gene abundance (using metagenomics) across different diet and feeding conditions in [light and dark] phases, we aimed to identify how feeding timing might influence microbial activity.”
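To make the distinction concrete, the toy Python sketch below (all read counts invented, not data from this study) contrasts gene abundance, roughly what metagenomic DNA reads capture, with gene expression, what metatranscriptomic RNA reads capture, for a hypothetical gene measured in the light and dark phases.

```python
# Toy illustration (invented counts): a gene can be equally *present* in the
# metagenome in both phases yet *expressed* almost only in the dark phase.
# Raw counts are normalized per million sequenced reads so the assays are comparable.

def per_million(count, total_reads):
    """Normalize a raw read count to counts per million (CPM)."""
    return count / total_reads * 1_000_000

# Hypothetical gene in one microbe, e.g., a bile salt hydrolase (bsh) gene.
samples = {
    #            (DNA reads, DNA total)      (RNA reads, RNA total)
    "light": {"dna": (120, 8_000_000), "rna": (15, 12_000_000)},
    "dark":  {"dna": (130, 8_500_000), "rna": (900, 11_000_000)},
}

for phase, assay in samples.items():
    dna_cpm = per_million(*assay["dna"])   # metagenomics: gene abundance
    rna_cpm = per_million(*assay["rna"])   # metatranscriptomics: gene activity
    activity = rna_cpm / dna_cpm           # expression relative to abundance
    print(f"{phase}: abundance={dna_cpm:.1f} CPM, expression={rna_cpm:.1f} CPM, "
          f"activity ratio={activity:.2f}")
# Abundance stays flat across phases while expression rises sharply at night,
# the kind of rhythm only metatranscriptomics can reveal.
```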
Because metagenomics focuses on stable genetic material, the technique cannot capture the real-time microbial responses to dietary timing that are reflected in rapidly changing, short-lived RNA. At the same time, the instability of that RNA makes metatranscriptomic experiments harder to run, which helps explain why researchers have not relied on the approach more widely.
To overcome this difficulty, Zarrinpar and his team had to wait to take advantage of improved bioinformatics tools to simplify their analysis of complex datasets. “It took several years for us to analyze this dataset because robust computational tools for metatranscriptomic analysis were not widely available when we initially collected our samples. Additionally, sequencing costs were very high. To clearly identify microbial activity, we needed deep sequencing coverage to distinguish species-level differences in gene expression, especially for genes that are common across multiple types of microbes,” said Zarrinpar.
What They Found
After monitoring the three groups of mice for 8 weeks, the researchers reviewed the results.
As predicted, the mice with unrestricted access to the high-fat diet ate during the daytime, when they would normally be at rest, and their daily microbial rhythms were blunted, Zarrinpar said.
“This unusual daytime activity interferes with important physiological processes. Consequently, the animals experience circadian misalignment, a condition similar to what human shift workers experience when their sleep-wake and eating cycles don’t match their internal biological clocks,” he continued. “This misalignment can negatively affect metabolism, immunity, and overall health, potentially leading to metabolic diseases.”
For the mice that consumed a high-fat diet within a TRF window, metabolic phenotyping demonstrated that the feeding regimen had protected them from the harmful effects induced by the high-fat diet, including adiposity, inflammation, and insulin resistance.
Even more promising, these mice not only were protected from metabolic disruption but also experienced physiological improvements, including better glucose homeostasis and partial restoration of the daily microbial rhythms that were absent in the mice with unrestricted access to the high-fat diet.
While the TRF approach did not fully restore the normal, healthy rhythmicity seen in the control mice, the researchers noted distinct shifts in microbial patterns that indicated time-dependent enrichment in genes attributed to lipid and carbohydrate metabolism.
Better Metabolic Health — and Better Tools for Researching It
Thankfully, the latest advancements in sequencing technology, including long-read sequencing methods, are making metatranscriptomics easier for research. “These newer platforms offer greater resolution at a lower cost, making metatranscriptomics increasingly accessible,” said Zarrinpar. With these emerging technologies, he believes metatranscriptomics will become a more standard, widely used method for researchers to better understand the influence of microbial activity on our health.
These tools, for example, enabled Zarrinpar and the team to delve deeper and focus on the transcription of a particular enzyme they identified as a pivotal influence in observable metabolic improvements: bile salt hydrolase (BSH), known to regulate lipid and glucose metabolism. The TRF approach notably enhanced the expression of the BSH gene during the daytime in the gut microbe Dubosiella newyorkensis, which has a functional human equivalent.
To determine why this happened, the team leveraged genetic engineering to insert several active BSH gene variants into a benign strain of gut bacteria administered to the mice. The only variant to produce metabolic improvements was the one derived from Dubosiella newyorkensis; the mice that were given this BSH-expressing engineered native bacterium (ENB) had increased lean muscle mass, less body fat, lower insulin levels, enhanced insulin sensitivity, and better blood glucose regulation.
“It is still early to know the full clinical potential of this new BSH-expressing engineered native bacterium,” said Zarrinpar. “However, our long-term goal is to develop a therapeutic that can be administered as a single dose, stably colonize the gut, and provide long-lasting metabolic benefits.” Testing the engineered bacteria in obese and diabetic mice on a high-fat diet would be a next step to determine whether its potential indeed holds up. If proven successful, it could then be used to develop future targeted therapies and interventions to treat common metabolic disorders.
Zarrinpar and his team are hopeful that the engineered bacterium alone can replicate the microbial benefits associated with following a TRF dietary schedule. “In our study, the engineered bacterium continuously expressed the enzyme DnBSH1, independently of dietary or environmental factors. As a result, the bacterium provided metabolic benefits similar to those seen with TRF, even without requiring the mice to strictly adhere to a TRF schedule,” said Zarrinpar.
“This suggests the exciting possibility that this engineered microbe might serve either as a replacement for TRF or as a way to enhance its beneficial effects,” he continued. “Further studies will help determine whether combining this ENB with TRF could provide additional or synergistic improvements in metabolic health.”
Looking Ahead
“As the pioneer of the single anastomosis duodenal switch which separates bile from food until halfway down the GI tract, I agree that bile is very important in controlling metabolism and glucose,” said Mitchell Roslin, MD, chief director of bariatric and metabolic surgery at Lenox Hill Hospital, and the Donald and Barbara Zucker School of Medicine, Hempstead, New York, who was not involved in the study. “Using enzymes or medications that work in the GI tract without absorption into the body is very interesting and has great potential. It is an early but exciting prospect.”
However, Roslin expressed some reservations. “I think we are still trying to understand whether the difference in microbiomes is the cause or effect/association. Is the microbiome the difference or is a different microbiome representative of a diet that has more fiber and less processed foods? Thus, while I find this academically fascinating, I think that there are very basic questions that need better answers, before we look at the transcription of bacteria.”
Furthermore, translating the metabolic results observed in mice to humans might not be as straightforward. “Small animal research is mandatory, but how the findings convert to humans is highly speculative,” said Roslin. “Mice that are studied are usually bred for medical research, with reduced genetic variation. Many animal models are more sensitive to time-restricted eating and caloric restriction than humans.”
While it requires further research and validation, this UC San Diego study nevertheless contributes to our overall understanding of host-microbe interactions. “We demonstrate that host circadian rhythms significantly influence microbial function, and conversely, these microbial functions can directly impact host metabolism,” said Zarrinpar. “Importantly, we now have a method to test how specific microbial activities affect host physiology by engineering native gut bacteria.”
Roslin similarly emphasized the importance of continued investment in exploring the microbial ecosystem inside us all. “There is wider evidence that bacteria and microbes are not just passengers using us for a ride but perhaps manipulating every action we take.”
A version of this article appeared on Medscape.com.
Dietary Trial Shows Benefits of a Low Emulsifier Diet for Crohn’s Disease
WASHINGTON, DC — A diet low in emulsifiers improved rates of clinical response and remission in a randomized controlled trial (ADDapt) involving 154 patients with mildly active Crohn’s disease living across the United Kingdom. The findings were reported at Gut Microbiota for Health (GMFH) World Summit 2025 by Benoit Chassaing, PhD, of the Institut Pasteur, Paris, France, whose research leading up to the trial has demonstrated that food additive emulsifiers — ubiquitous in processed foods — alter microbiota composition and lead to microbiota encroachment into the mucus layer of the gut and subsequent chronic gut inflammation.
Patients in the ADDapt trial, which was also reported in an abstract earlier this year at the European Crohn’s and Colitis Organization (ECCO) 2025 Congress, had a Crohn’s Disease Activity Index (CDAI) of 150-250 and evidence of inflammation (faecal calprotectin [FCP] ≥ 150 µg/g or on endoscopy/radiology). All “had been exposed in their regular diets to emulsifiers,” said Chassaing, a co-investigator, during a GMFH session on “Dietary Drivers of Health and Disease.”
They were randomized to either a low-emulsifier diet or to a low-emulsifier diet followed by emulsifier “resupplementation” — a design meant to “account for the very strong placebo effect that is always observed with dietary studies,” he said.
All patients received dietary counseling, a smartphone app with barcode scanning to support shopping, and weekly support. They also received supermarket foods covering 25% of their needs that either were free of emulsifiers or contained emulsifiers, and they were provided three snacks per day that were either emulsifier-free or contained carrageenan, carboxymethylcellulose (CMC), and polysorbate-80 (P80) — dietary emulsifiers that are commonly added to processed foods to enhance texture and extend shelf life.
In the intention-to-treat (ITT) analysis, 49% of patients in the intervention group reached the primary endpoint of a 70-point reduction or more in CDAI response after 8 weeks compared with 31% of those in the control group (P = .019), with an adjusted relative risk of response of 3.1 (P = .003), Chassaing shared at the GMFH meeting, convened by the American Gastroenterological Association and the European Society of Neurogastroenterology and Motility.
In the per-protocol analysis (n = 119), 61% and 47% of patients in the intervention and control groups, respectively, reached the primary outcome of CDAI response, with an adjusted relative risk of response of 3.0 (P = .018), he said.
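As a rough, unadjusted reading of those intention-to-treat figures, the short calculation below translates the reported proportions into approximate counts and a crude risk ratio; the even 77/77 split is an assumption made only for illustration, and the trial’s adjusted relative risk of 3.1 comes from a covariate-adjusted model that simple arithmetic does not reproduce.

```python
# Crude arithmetic on the reported ITT proportions. Assumptions: an even
# 77/77 randomization of the 154 patients; no covariate adjustment. The
# trial's adjusted relative risk of 3.1 is not reproducible this way.
n_per_arm = 154 // 2                        # assumed equal allocation
p_intervention, p_control = 0.49, 0.31      # reported ITT response rates

responders_intervention = round(p_intervention * n_per_arm)   # ~38 patients
responders_control = round(p_control * n_per_arm)             # ~24 patients
crude_rr = p_intervention / p_control                         # ~1.58, unadjusted
risk_difference = p_intervention - p_control                  # 18 percentage points
nnt = 1 / risk_difference                                     # ~5.6 patients per extra responder

print(responders_intervention, responders_control)
print(f"crude RR = {crude_rr:.2f}, risk difference = {risk_difference:.0%}, NNT = {nnt:.1f}")
```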
Secondary endpoints included CDAI remission at 24 weeks, and according to the abstract for the ECCO Congress, in the ITT analysis, patients in the intervention group were more than twice as likely to experience remission.
Chassaing noted at the GMFH meeting that as part of the study, he and coinvestigators have been investigating the participants’ gut microbiota with metagenomic analyses. The study was led by Kevin Whelan, PhD, head of the Department of Nutritional Sciences at King’s College London, London, England.
Can Emulsifier-Sensitive Individuals Be Identified?
In murine model research 10 years ago, Chassaing showed that the administration of CMC and P80 results in microbiota encroachment into the mucus layer of the gut, alterations in microbiota composition — including an increase in bacteria that produce pro-inflammatory flagellin — and development of chronic inflammation.
Wild-type mice treated with these compounds developed metabolic disease, and mice that were modified to be predisposed to colitis had a higher incidence of robust colitis. Moreover, fecal transplantation from emulsifier-treated mice to germ-free mice reproduced these changes, “clearly suggesting that the microbiome itself is sufficient to drive chronic inflammation,” he said.
In recent years, in humans, analyses from the large French NutriNet-Santé prospective cohort study have shown associations between exposure to food additive emulsifiers and the risk for cardiovascular disease, the risk for cancer (overall, breast, and prostate), and the risk for type 2 diabetes.
But to explore causality and better understand the mechanisms of emulsifier-driven changes on the microbiota, Chassaing and his colleagues also launched the FRESH study (Functional Research on Emulsifier in Humans), a double-blind randomized controlled-feeding study of the emulsifier CMC. For 11 days, nine healthy participants consumed an emulsifier-free diet and 11 consumed an identical diet enriched with 15 g/d of CMC.
Participants on the CMC-containing diet had reduced microbiota diversity and depletions of an array of microbiota-related metabolites, but only a small subset had profound alterations in microbiota composition and increased microbiota encroachment into the mucus layer. “Some seemed to be resistant to CMC-induced microbiota encroachment, while some were highly susceptible,” Chassaing said.
The pilot study raised the question, he said, of whether there is an “infectivity component” — some kind of “sensitive” gut microbiota composition — that may be associated with dietary emulsifier-driven inflammation and disease.
In other murine research, Chassaing and his team found that germ-free mice colonized with Crohn’s disease-associated adherent-invasive E coli (AIEC) and subsequently given CMC or P80 developed chronic inflammation and metabolic dysregulation, “clearly demonstrating that you can convert resistant mice to sensitive mice just by adding one bacteria to the ecosystem,” he said. “The presence of AIEC alone was sufficient to drive the detrimental effects of dietary emulsifiers.”
(In vitro research with transcriptomic analysis then showed that the emulsifiers directly elicit AIEC virulence gene expression, Chassaing and his coauthors wrote in their 2020 paper, facilitating AIEC’s “penetration of the mucus layer and adherence to epithelial cells and resulting in activation of host pro-inflammatory signaling.”)
“We don’t think it’s solely the AIEC bacteria that will drive emulsifier sensitivity, though…we think it’s more complex,” Chassaing said at the meeting. Overall, the findings raise the question of whether emulsifier-sensitive individuals can be identified.
This, he said, is one of his most recent research questions. His lab has led the development of an in vitro microbiota model built to predict an individual’s sensitivity to emulsifiers. In a study published in April, the model recapitulated the differential CMC sensitivity observed in the earlier FRESH study, suggesting that an individual’s sensitivity to emulsifiers can indeed be predicted by examining their baseline microbiota.
Interpreting the Epidemiology
Chassaing’s research arc illustrates the synergy between epidemiological research, basic/translational research, and clinical interventional research that’s needed to understand the diet-microbiome intersection in inflammatory bowel disease, said Ashwin Ananthakrishnan, MBBS, MPH, AGAF, associate professor of medicine at Massachusetts General Hospital, Boston, in an interview at the meeting.
“It’s a good example of how to really span the spectrum, starting from the big picture and going deeper to understand mechanisms, and starting from mechanisms and expanding it out,” Ananthakrishnan said.
In his own talk about research on IBD, Ananthakrishnan said that epidemiological data have shown over the past 10-15 years that total dietary fiber is inversely associated with the risk for Crohn’s disease (with the strongest associations with fiber from fruits and vegetables). Studies have also shown that a higher intake of polyunsaturated fatty acids is associated with a lower risk for ulcerative colitis, whereas “an n-6-fatty acid-rich diet is associated with a higher risk of ulcerative colitis,” he said.
Dietary cohort studies, meanwhile, have shed light on the influence of dietary patterns — such as the Mediterranean diet and diets with high inflammatory potential — on IBD. A diet rich in ultra-processed foods has also been shown in a prospective cohort study to be associated with a higher risk for Crohn’s disease, with certain categories of ultra-processed foods (eg, breads and breakfast foods) having the strongest associations.
Such studies are limited in part, however, by inadequate assessment of potentially relevant variables such as emulsifiers, preservatives, and how the food is processed, he said.
And in interpreting the epidemiological research on fiber and IBD, for instance, one must appreciate that “there are a number of mechanisms by which fiber is impactful…there’s a big picture to look at,” Ananthakrishnan said. Fiber “can affect the microbiome, clearly, it can affect the gut barrier, and it can affect bile acids, and there are detailed translational studies in support of each of these.”
But there are other constituents of fruits and vegetables “that could potentially influence disease risk, such as AhR ligands and polyphenols,” he said. “And importantly, people not eating a lot of fiber may be eating a lot of ultra-processed foods.”
Most interventional studies of fiber have not shown a benefit of a high-fiber diet, Ananthakrishnan said, but there are multiple possible reasons and factors at play, including potential population differences (eg, in inflammatory status or baseline microbiota), shortcomings of the interventions, and potentially inaccurate outcomes.
Abigail Johnson, PhD, RDN, associate director of the Nutrition Coordinating Center, University of Minnesota Twin Cities, which supports dietary analysis, said during the session that the focus of dietary research is “moving toward understanding overall dietary patterns” as opposed to focusing more narrowly on vitamins, minerals, and macronutrients such as proteins, fats, and carbohydrates.
This is an improvement, though “we still don’t have good approaches for understanding [the contributions of] things like additives and emulsifiers, food preparation and cooking, and food processing,” said Johnson, assistant professor in the Division of Epidemiology and Community Health at University of Minnesota Twin Cities. “Perhaps by looking at things at the food level we can overcome some of these limitations.”
Ananthakrishnan reported being a consultant for Geneoscopy and receiving a research grant from Takeda. Chassaing did not report any financial disclosures. Johnson reported that she had no financial disclosures.
A version of this article appeared on Medscape.com.
FROM GMFH 2025
Treating Metastatic RCC: From Risk Assessment to Therapy Selection
Treatment of metastatic renal cell carcinoma (RCC) is complex and requires careful analysis of risk and treatment options, an oncologist said at the July Association of VA Hematology and Oncology (AVAHO) seminar on treating veterans with kidney cancer, held in Long Beach, California.
“We’ve come a long way in treating this disease, but individualizing therapy remains critical, especially in complex populations like our veterans,” said Matthew B. Rettig, MD, chief of Hematology-Oncology at the Veterans Affairs Greater Los Angeles Healthcare System and professor of Medicine and Urology at UCLA.
Rettig emphasized 2 critical early questions clinicians should consider when encountering metastatic RCC. First: Can the patient be treated with localized interventions such as metastasectomy, radiation therapy, or nephrectomy? These can be curative, Rettig said.
And second: Does the patient currently need systemic therapy? “[There are] a small subset of patients,” Rettig said, “who go into a durable, complete remission, dare I say ‘cure,’ with immunotherapeutic-based approaches.”
Rettig highlighted the International Metastatic Renal Cell Carcinoma Database Consortium (IMDC) criteria as a guide for clinicians as they determine the best strategy for treatment. The IMDC model estimates survival across lines of therapy by incorporating 6 prognostic factors: anemia, hypercalcemia, neutrophilia, thrombocytosis, performance status, and time from diagnosis to treatment.
These criteria classify patients into favorable, intermediate, or poor risk categories that can guide first-line systemic therapy. The criteria also provide estimates of median survival.
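As a rough illustration of how the IMDC criteria sort patients into those categories, the Python sketch below applies the published cutoffs (0 adverse factors for favorable risk, 1-2 for intermediate, 3 or more for poor). The laboratory thresholds in the comments come from the published model, not from Rettig’s talk, and are included only as assumptions for illustration.

```python
# Minimal sketch of IMDC-style risk grouping, assuming the published cutoffs
# (0 factors = favorable, 1-2 = intermediate, >=3 = poor). Thresholds in the
# comments are illustrative assumptions taken from the published criteria.

from dataclasses import dataclass

@dataclass
class IMDCFactors:
    anemia: bool                   # hemoglobin below lower limit of normal
    hypercalcemia: bool            # corrected calcium above upper limit of normal
    neutrophilia: bool             # neutrophil count above upper limit of normal
    thrombocytosis: bool           # platelet count above upper limit of normal
    poor_performance_status: bool  # Karnofsky performance status < 80%
    short_time_to_treatment: bool  # < 1 year from diagnosis to systemic therapy

def imdc_risk_group(f: IMDCFactors) -> str:
    """Return the IMDC risk category from the count of adverse factors."""
    count = sum([f.anemia, f.hypercalcemia, f.neutrophilia,
                 f.thrombocytosis, f.poor_performance_status,
                 f.short_time_to_treatment])
    if count == 0:
        return "favorable"
    if count <= 2:
        return "intermediate"
    return "poor"

# Example: two adverse factors place a patient in the intermediate-risk group.
print(imdc_risk_group(IMDCFactors(True, False, False, False, True, False)))
```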
Rettig noted a “huge percentage” of veterans mirror the intermediate-risk demographics of clinical trial cohorts but often present with greater comorbidity burdens: “That plays into whether we treat and how we treat,” he said.
Rettig highlighted kidney cancer guidelines from the National Comprehensive Cancer Network and noted that several trials examined first-line use of combinations of vascular endothelial growth factor receptor tyrosine kinase inhibitors (TKIs) and checkpoint inhibitors.
There’s a general theme in the findings, he said: “You have OS (overall survival) and PFS (progression-free survival) benefit in the intermediate/poor risk group, but only PFS benefit in the patients who have favorable-risk disease. And you see higher objective response rates with the combinations.
“If you have a patient who's highly symptomatic or has an organ system threatened by a metastasis, you'd want to use a combination that elicits a higher objective response rate,” Rettig added.
A TKI is going to be the most appropriate second-line therapy for patients who received a prior checkpoint inhibitor, Rettig said.
“Don't change to another checkpoint inhibitor,” he said. “We have enough phase 3 data that indicates checkpoint inhibitors are no longer really adding to benefit once they’ve had a checkpoint inhibitor.”
For patients who are checkpoint inhibitor-naïve, however, Rettig said checkpoint inhibitors should still be considered, especially given the potential for durable remissions. As for third-line therapy, he said, “we have both belzutifan and tivozanib, which have been shown to improve PFS. More studies are ongoing.”
There are many adverse events linked to TKIs, Rettig said, including cardiovascular problems, thrombosis, hypertension, heart failure, torsades de pointes, QT prolongation, and gastrointestinal toxicity. TKIs tend to be the major drivers of adverse events in combination therapy.
Rettig emphasized the shorter half-life of the TKI axitinib, which he said allows for easier management of toxicities: “That’s why it’s preferred in the VA RCC clinical pathway.”
Rettig discloses relationships with Ambrx, Amgen, AVEO, Bayer, INmune Bio, Johnson & Johnson Health Care Systems, Lantheus, Merck, Myovant, Novartis, ORIC, and Progenics.