Continuous Glucose Monitors for All? Opinions Remain Mixed


The recent US Food and Drug Administration (FDA) clearance of two over-the-counter (OTC) continuous glucose monitors (CGMs) — Dexcom’s Stelo and Abbott’s Lingo — has sparked interest in potentially expanding their use to those without diabetes or prediabetes.

There are several valid questions about how the general population might benefit from CGMs. Can they motivate those struggling with overweight to shed pounds? Would they prompt users to follow more healthful eating patterns? Can they act as a canary in the coal mine, alerting users to prediabetes? 

The short answer to these questions is, we don’t know.

“Glucose levels fluctuate in everyone in response to meals, exercise, stress, etc, but there has been no credible research to support CGM use by most people who do not have diabetes,” Jill Crandall, MD, chief of endocrinology at Albert Einstein College of Medicine and Montefiore Health System in New York City, said in an interview.

“The utility of CGM for people without diabetes hasn’t been established and the drive to market CGM as an OTC device seems largely driven by financial considerations,” Crandall said. She advocates instead for a strategy directed at more meaningful objectives.

“For now, efforts should be focused on making CGMs available to patients who will clearly benefit — ie, people with diabetes, especially those who are using insulin and those who are struggling to achieve desired levels of glucose control.” 

Nicole Spartano, PhD, assistant professor of medicine in endocrinology, diabetes, nutrition and weight management at Boston University’s Chobanian & Avedisian School of Medicine in Massachusetts, agreed with this assessment.

“It is definitely too early to make recommendations for patients without diabetes based on their CGM data,” said Spartano, who also serves as the director of the Glucose Monitoring Station at the Framingham Heart Study in Framingham, Massachusetts. “We simply do not have enough follow-up data to tell us which CGM metrics are associated with higher risk for disease.” 

Spartano was the lead author of a recent study characterizing time spent in various CGM ranges in a large cohort of individuals without diabetes who wore the Dexcom G6 Pro model. She said the data may eventually be used to establish reference ranges for clinicians and individuals.

“We are working on another paper surveying diabetologists and CGM experts about how they interpret CGM reports from individuals without diabetes,” she said in an interview. Although the data are not yet published, Spartano said, “we are finding that clinicians are currently very discordant in how they interpret these reports.”
 

Potential Benefits Right Now

Satish Garg, MD, director of the Adult Clinic at the Barbara Davis Center for Diabetes at the University of Colorado Anschutz Medical Campus, Aurora, and editor-in-chief of Diabetes Technology & Therapeutics, is convinced that glucose should be considered another vital sign, like blood pressure, pulse rate, respiration rate, and body temperature. Therefore, he sees the use of a CGM in people without diabetes as a way to build awareness and perhaps prompt behavior modification.

“Someone with an A1c of 4.9 on a normal day may notice that they’ve gained a little bit of weight, and if they use an OTC CGM and start seeing changes, it might help them to modulate their diet themselves, whether they see a dietitian or not,” Garg said.

He gave the example of “a natural behavioral change” occurring when someone using a CGM declines to eat a post-meal dessert after seeing their blood glucose had already risen to 170.

Wearing a CGM also has the potential to alert the user to high blood glucose, leading them to an earlier diagnosis of prediabetes or diabetes, Shichun Bao, MD, PhD, Diabetes Technology Program Leader at the Vanderbilt Eskind Diabetes Clinic of Vanderbilt University in Nashville, Tennessee, said in an interview. She has had cases where a family member of someone with diabetes used the patient’s fingerstick meter, found that their glucose was 280, and self-diagnosed with diabetes.

“It’s the same thing with the CGM,” she said. “If they somehow did not know they have diabetes and they wear a CGM and it shows their sugar is high, that will help them to know to see their provider to get a diagnosis, get treated, and track progression.”

Given the shortage of endocrinologists and long waits for appointments in the United States and elsewhere, it is very likely that primary care physicians will be the ones fielding questions from individuals without diabetes interested in purchasing an OTC CGM. Internist Douglas Paauw, MD, a professor at the University of Washington School of Medicine, Seattle, said in an interview that, for his practice, “the benefits outweigh some of the limitations.”

“I don’t really think somebody who doesn’t have diabetes needs to be using a CGM all the time or long term,” he said. “But I have used it in a few people without diabetes, and I think if someone can afford to use it for 2-4 weeks, especially if they’ve been gaining weight, then they can really recognize what happens to their bodies when they eat certain foods.”

Paauw added that a CGM teaches his patients about healthy eating more effectively than any lecture from him could. “There’s nothing like immediate feedback on what happens to your body to change behavior.”

Similarly, William Golden, medical director at Arkansas Medicaid and professor of medicine and public health at the University of Arkansas for Medical Sciences, Little Rock, said in an interview that “it is difficult to justify coverage for CGMs on demand — but if people want to invest in their own devices and the technology motivates them to eat better and/or lose weight, then there are benefits to be had.” 

Potential Downsides

Although it may seem simple to use an OTC CGM to measure blood glucose on the fly, in the real world it can take patients time to understand these devices, “especially the first day or so, when users are going to get false lows,” Bao said. “Clinicians need to tell them if you don’t feel like your sugar is low and the device says it’s low, whether they do or don’t have diabetes, they should do a fingerstick glucose test to confirm the low before rushing to take in sugar. On the other hand, if they drink a lot of juice, their sugar will go high. So, it can create problems and false results either way.”

Many factors affect glucose, she said. “When you’re sick, glucose can go high, and when you’re very sick, in the ICU, sometimes it can be low. It depends on the situation.” Bao noted that certain vitamins and drugs can also interfere with readings.

Bao doesn’t see value in having people without diabetes monitor their glucose continuously. “If they want to see what foods or exercise do to their body, they will probably benefit from a short trial to gain some insight; otherwise, they’re wasting money,” she said.

Another potential downside is that there are no head-to-head comparison data on the approved devices, Garg said. “But it’s clear to us that Stelo’s range is very narrow, 70 to 200, whereas the Lingo ranges are pretty much full, from 40 to 400 or 55 to 400. So, we don’t know the accuracy of these sensors.”

Golden observed that for certain patients, CGMs may lead to psychological distress rather than providing a sense of control over their blood glucose levels.

“I have had a nondiabetic patient or two that obsessed about their blood sugars and a device would only magnify their anxiety/neurosis,” he said. “The bottom line is that it’s a tool for a balanced approach to health management, but the daily results must be kept in perspective!”
 

Educate Patients, Primary Care Physicians

To maximize potential benefits for patients without diabetes, clinicians need to be well trained in the use and interpretation of results from the devices, Bao said. They can then better educate their patients, including discussing with them possible pitfalls surrounding their use. 

“For example, a patient may see that their blood glucose, as measured by a fingerstick, is 95, whereas the CGM says 140, and ask, ‘Which one do I trust?’ ”

This is where the patient can be educated about the difference between interstitial glucose, as measured by the CGM, and blood glucose, as measured by the fingerstick. Because glucose takes about 15 minutes to move from the blood into the interstitial tissue, there is a lag, and the two measurements will differ.

“A discrepancy of 20% is totally acceptable for that reason,” Bao said.

She has also seen several examples where patients were misled by their CGM when its sensor became dislodged.

“Sometimes when a sensor has moved, the patient may push it back in because they don’t want to throw it away. But it doesn’t work that way, and they end up with inaccurate readings.” 

At a minimum, Bao added, clinicians and patients should read the package insert but also be aware that it doesn’t list everything that might go wrong or interfere with the device’s accuracy.

Manufacturers of OTC devices should be training primary care and family practice doctors in their use, given the expected “huge” influx of patients wanting to use them, according to Garg.

“If you are expecting endos or diabetes specialists to see these people, that’s never going to happen,” he said. “We have a big shortage of these specialists, so industry has to train these doctors. Patients will bring their data to their doctors, and the clinicians need to learn the basics of how to interpret the glucose values they see. Then they can treat these patients rather than shipping all of them to endos who likely are not available.”

Paauw agreed that CGM training should be directed largely toward primary care professionals, who can spare their under-resourced endocrinologist colleagues an uptick in “the worried well.” 

“The bottom line is that primary care professionals do need to understand the CGM,” he said. “They do need to get comfortable with it. They do need to come up with opinions on how to use it. The public’s going to be using it, and we need to be competent in it and use our subspecialists appropriately.”

Spartano received funding for an investigator-initiated research grant from Novo Nordisk unrelated to the cited CGM studies. Garg, Bao, Paauw, Golden, and Crandall declared no relevant conflicts of interest.

A version of this article first appeared on Medscape.com.

Is Acute Kidney Injury Really a Single Disease?


The search for a better biomarker than creatinine for acute kidney injury (AKI) has been “long and elusive.” However, could researchers be on the right path now?

“The thinking is moving away from trying to find one biomarker that can be used for different types of kidney injury to a recognition that AKI is not just a single disease that a patient has or doesn’t have,” Rob D. Nerenz, PhD, an associate professor in the Department of Pathology and Laboratory Medicine at the Medical College of Wisconsin, Milwaukee, told this news organization. “It’s lots of different diseases that all affect the kidney in different ways.”

AKI is actually a “loose collection” of hepatorenal, cardiorenal, nephrotoxic, and sepsis-associated syndromes, as well as acute interstitial nephritis (AIN), he said. “So the question is not: ‘Is AKI present — yes or no?’ It’s: ‘What kind of AKI is present, and how do I treat it?’ ”
 

‘Mediocre Markers’

AKI affects about 10%-30% of hospitalized patients, according to Nerenz. It’s associated with an increased risk for adverse outcomes, including post-AKI chronic kidney disease and a mortality rate of approximately 24%.

Currently, AKI is defined by a rapid increase in serum creatinine, a decrease in urine output, or both.

“Those are mediocre markers,” Nerenz said, as serum creatinine is not very sensitive to acute change, and the increase is often detected after the therapeutic window of intervention has passed. In addition, “it only tells us that the kidneys are unhappy; it doesn’t say anything about the cause.”

Urine output is limited as a marker because many conditions affect it. “If you’re dehydrated, urine output is going to decrease,” he said. “And in some forms of AKI, urine output actually goes up.”

What’s needed, he said, is a more sensitive biomarker that’s detectable within a shorter timeframe of 2-6 hours following injury.

“Right now, we’re looking at 48 hours before a change becomes apparent, and that’s just too long. Plus, it should be kidney specific. One of the major limitations of the biomarkers that have been evaluated to this point is that, yes, they’re released by the kidney, but they’re also released by other tissue types within the body, and that hinders their effectiveness as a marker.”
 

Neutrophil Gelatinase-Associated Lipocalin (NGAL)

Although research on better biomarkers is ongoing, “there’s also a recognition that some of the protein markers that have been around for a while, if used appropriately, can provide value,” Nerenz said. These include, among others, NGAL.

NGAL works well in pediatric patients without other comorbidities, but it has been less useful in adult patients because it is also released by other cell types. However, recent research suggests it shows promise in patients with both cirrhosis and AKI.

There are three main causes of AKI in cirrhosis, Nerenz explained. The first is prerenal and can be primarily addressed through rehydration.

“When these patients come in, clinicians won’t do anything right away other than provide fluids. If creatinine improves over the 48-hour period of fluid replenishment, then the patient is sent home because there really isn’t extensive damage to the kidneys.”

If improvement isn’t seen after those 48 hours, then it could be one of two things: hepatorenal syndrome or acute tubular necrosis. Patients with hepatorenal syndrome are candidates for terlipressin, which the Food and Drug Administration (FDA) approved for this indication in 2022 after it displayed notable efficacy in a double-blind study.

“You don’t want to give terlipressin to just anybody because if the issue is not a diminished blood supply to the kidney, it’s not going to help, and comes with some serious side effects, such as respiratory failure,” Nerenz explained. “Having a biomarker that can distinguish between hepatorenal syndrome and acute tubular necrosis really helps clinicians confidently identify which patients are good candidates for this drug. Right now, we’re flying blind to a certain extent, basically using clinical intuition.”

Currently, NGAL testing is FDA cleared only for pediatric use. One way hospitals have dealt with that is by developing the test in their own labs, with appropriate reagents, validation, and so forth. Such lab-developed tests can then be used in adults but haven’t gone through the FDA approval process.

However, the FDA’s recent announcement that it intends to oversee lab-developed tests has made this situation unclear, Nerenz said.

“At this point, we don’t know if there’s still an opportunity to take the NGAL test (or any other cleared biomarker) and validate it for use in a different patient population. Many hospital labs simply don’t have the resources to take these tests through the whole FDA approval process.”

A New Biomarker for AIN?

Meanwhile, research is moving forward on a better biomarker for AIN, which also falls under the AKI umbrella.

“It’s important to diagnose AIN because it has a very specific treatment,” Dennis G. Moledina, MD, PhD, Yale School of Medicine in New Haven, Connecticut, told this news organization.

“AIN is caused by a bunch of different medications, such as proton pump inhibitors, cancer drugs, nonsteroidal anti-inflammatory drugs, and antibiotics, so when someone has this condition, you have to stop potentially life-saving medications and give unnecessary and potentially toxic immunosuppressive drugs, like prednisone,” he said. “If you get the diagnosis wrong, you’re stopping vital drugs and giving immunosuppression for no reason. And if you miss the diagnosis, AIN can lead to permanent chronic kidney disease.”

“Right now, the only way to diagnose AIN is to do a kidney biopsy, which is risky because it can often lead to significant bleeding,” he said. “Some people can’t undergo a biopsy because they’re on medications that increase the risk of bleeding, and they can’t be stopped.”

Furthermore, he noted, “the longer a patient takes a drug that’s causing AIN without getting a diagnosis, the less the chances of recovery because the longer you let this kidney inflammation go on, the more fibrosis and permanent damage develops. So it is important to diagnose it as early as possible, and that’s again why we have a real need for a noninvasive biomarker that can be tested rapidly.”

Moledina and colleagues have been working on identifying a suitable biomarker for close to 10 years, the latest example of which is their 2023 study validating urinary CXCL9 as just such a marker.

“We’re most excited about CXCL9 because it’s already used to diagnose some other diseases in plasma,” Moledina said. “We think that we can convince labs to test it in urine.”

In an accompanying editorial, Mark Canney, PhD, and colleagues at the University of Ottawa and The Ottawa Hospital in Ontario, Canada, wrote that the CXCL9 study findings “are exciting because they provide a road map of where diagnostics can get to for this common, yet poorly identified and treated, cause of kidney damage. The need for a different approach can be readily identified from the fact that clinicians’ gestalt for diagnosing AIN was almost tantamount to tossing a coin (AUC, 0.57). CXCL9 alone outperformed not only the clinician’s prebiopsy suspicion but also an existing diagnostic model and other candidate biomarkers both in the discovery and external validation cohorts.”

Like NGAL, CXCL9 will have to go through the FDA approval process before it can be used for AIN. Therefore, it may be a few years before it can become routinely available, Moledina said.

Nevertheless, Nerenz added, “I think the next steps for AKI are probably continuing on this path of context-dependent, selective biomarker use. I anticipate that we’ll see ongoing development in this space, just expanding to a wider variety of clinical scenarios.”

Nerenz declared receiving research funding from Abbott Labs for evaluation of an AKI biomarker. Moledina is a co-inventor on a pending patent, “Methods and Systems for Diagnosis of Acute Interstitial Nephritis”; a cofounder of the diagnostics company Predict AIN; and a consultant for Biohaven.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

The search for a better biomarker than creatine for acute kidney injury (AKI) has been “long and elusive.” However, could researchers be on the right path now?

“The thinking is moving away from trying to find one biomarker that can be used for different types of kidney injury to a recognition that AKI is not just a single disease that a patient has or doesn’t have,” Rob D. Nerenz, PhD, an associate professor in the Department of Pathology and Laboratory Medicine at the Medical College of Wisconsin, Milwaukee, told this news organization. “It’s lots of different diseases that all affect the kidney in different ways.”

AKI is actually a “loose collection” of hepatorenal, cardiorenal, nephrotoxic, and sepsis-associated syndromes, as well as acute interstitial nephritis (AIN), he said. “So the question is not: ‘Is AKI present — yes or no?’ It’s: ‘What kind of AKI is present, and how do I treat it?’ ”
 

‘Mediocre Markers’

AKI affects about 10%-30% of hospitalized patients, according to Nerenz. It’s associated with an increased risk for adverse outcomes, including post-AKI chronic kidney disease and a mortality rate of approximately 24%.

Currently, AKI is defined by a rapid increase in serum creatinine, a decrease in urine output, or both.

“Those are mediocre markers,” Nerenz said, as serum creatinine is not very sensitive to acute change, and the increase is often detected after the therapeutic window of intervention has passed. In addition, “it only tells us that the kidneys are unhappy; it doesn’t say anything about the cause.”

Urine output is limited as a marker because many conditions affect it. “If you’re dehydrated, urine output is going to decrease,” he said. “And in some forms of AKI, urine output actually goes up.”

What’s needed, he said, is a more sensitive biomarker that’s detectable within a shorter timeframe of 2-6 hours following injury.

“Right now, we’re looking at 48 hours before a change becomes apparent, and that’s just too long. Plus, it should be kidney specific. One of the major limitations of the biomarkers that have been evaluated to this point is that, yes, they’re released by the kidney, but they’re also released by other tissue types within the body, and that hinders their effectiveness as a marker.”
 

Neutrophil Gelatinase-Associated Lipocalin (NGAL)

Although research on better biomarkers is ongoing, “there’s also a recognition that some of the protein markers that have been around for a while, if used appropriately, can provide value,” Nerenz said. These include, among others, NGAL.

NGAL works well in pediatric patients without other comorbidities, but it has been less useful in adult patients because it is also released by other cell types. However, recent research suggests it shows promise in patients with both cirrhosis and AKI.

There are three main causes of AKI in cirrhosis, Nerenz explained. The first is prerenal and can be primarily addressed through rehydration.

“When these patients come in, clinicians won’t do anything right away other than provide fluids. If creatinine improves over the 48-hour period of fluid replenishment, then the patient is sent home because there really isn’t extensive damage to the kidneys.”

If improvement isn’t seen after those 48 hours, then it could be one of two things: Hepatorenal syndrome or acute tubular necrosis. Patients with hepatorenal syndrome are candidates for terlipressin, which the Food and Drug Administration (FDA) approved for this indication in 2022 after it displayed notable efficacy in a double-blind study.

“You don’t want to give terlipressin to just anybody because if the issue is not a diminished blood supply to the kidney, it’s not going to help, and comes with some serious side effects, such as respiratory failure,” Nerenz explained. “Having a biomarker that can distinguish between hepatorenal syndrome and acute tubular necrosis really helps clinicians confidently identify which patients are good candidates for this drug. Right now, we’re flying blind to a certain extent, basically using clinical intuition.”

Currently, the determination of NGAL is FDA cleared only for pediatric use. One way hospitals have dealt with that is by making the test in their own labs, using appropriate reagents, validation, and so forth. These tests are then safe for use in adults but haven’t gone through the FDA approval process.

However, the FDA’s recent announcement stating that the agency should oversee lab-developed tests has made this situation unclear, Nerenz said.

“At this point, we don’t know if there’s still an opportunity to take the NGAL test (or any other cleared biomarker) and validate it for use in a different patient population. Many hospital labs simply don’t have the resources to take these tests through the whole FDA approval process.”
 

 

 

A New Biomarker for AIN?

Meanwhile, research is also moving forward on a better biomarker for AIN, which is also under the AKI umbrella.

“It’s important to diagnose AIN because it has a very specific treatment,” Dennis G. Moledina, MD, PhD, Yale School of Medicine in New Haven, Connecticut, told this news organization.

“AIN is caused by a bunch of different medications, such as proton pump inhibitors, cancer drugs, nonsteroidal anti-inflammatory drugs, and antibiotics, so when someone has this condition, you have to stop potentially life-saving medications and give unnecessary and potentially toxic immunosuppressive drugs, like prednisone,” he said. “If you get the diagnosis wrong, you’re stopping vital drugs and giving immunosuppression for no reason. And if you miss the diagnosis, AIN can lead to permanent chronic kidney disease.”

“Right now, the only way to diagnose AIN is to do a kidney biopsy, which is risky because it can often lead to significant bleeding,” he said. “Some people can’t undergo a biopsy because they’re on medications that increase the risk of bleeding, and they can’t be stopped.”

Furthermore, he noted, “the longer a patient takes a drug that’s causing AIN without getting a diagnosis, the less the chances of recovery because the longer you let this kidney inflammation go on, the more fibrosis and permanent damage develops. So it is important to diagnose it as early as possible, and that’s again why we have a real need for a noninvasive biomarker that can be tested rapidly.”

Moledina and colleagues have been working on identifying a suitable biomarker for close to 10 years, the latest example of which is their 2023 study validating urinary CXCL9 as just such a marker.

“We’re most excited about CXCL9 because it’s already used to diagnose some other diseases in plasma,” Moledina said. “We think that we can convince labs to test it in urine.”

In an accompanying editorial, Mark Canney, PhD, and colleagues at the University of Ottawa and The Ottawa Hospital in Ontario, Canada, wrote that the CXCL9 study findings “are exciting because they provide a road map of where diagnostics can get to for this common, yet poorly identified and treated, cause of kidney damage. The need for a different approach can be readily identified from the fact that clinicians’ gestalt for diagnosing AIN was almost tantamount to tossing a coin (AUC, 0.57). CXCL9 alone outperformed not only the clinician’s prebiopsy suspicion but also an existing diagnostic model and other candidate biomarkers both in the discovery and external validation cohorts.”

Like NGAL, CXCL9 will have to go through the FDA approval process before it can be used for AIN. Therefore, it may be a few years before it can become routinely available, Moledina said.

Nevertheless, Nerenz added, “I think the next steps for AKI are probably continuing on this path of context-dependent, selective biomarker use. I anticipate that we’ll see ongoing development in this space, just expanding to a wider variety of clinical scenarios.”

Nerenz declared receiving research funding from Abbott Labs for evaluation of an AKI biomarker. Moledina is a co-inventor on a pending patent, “Methods and Systems for Diagnosis of Acute Interstitial Nephritis”; a cofounder of the diagnostics company Predict AIN; and a consultant for Biohaven.

A version of this article first appeared on Medscape.com.

The search for a better biomarker than creatine for acute kidney injury (AKI) has been “long and elusive.” However, could researchers be on the right path now?

“The thinking is moving away from trying to find one biomarker that can be used for different types of kidney injury to a recognition that AKI is not just a single disease that a patient has or doesn’t have,” Rob D. Nerenz, PhD, an associate professor in the Department of Pathology and Laboratory Medicine at the Medical College of Wisconsin, Milwaukee, told this news organization. “It’s lots of different diseases that all affect the kidney in different ways.”

AKI is actually a “loose collection” of hepatorenal, cardiorenal, nephrotoxic, and sepsis-associated syndromes, as well as acute interstitial nephritis (AIN), he said. “So the question is not: ‘Is AKI present — yes or no?’ It’s: ‘What kind of AKI is present, and how do I treat it?’ ”
 

‘Mediocre Markers’

AKI affects about 10%-30% of hospitalized patients, according to Nerenz. It’s associated with an increased risk for adverse outcomes, including post-AKI chronic kidney disease and a mortality rate of approximately 24%.

Currently, AKI is defined by a rapid increase in serum creatinine, a decrease in urine output, or both.

“Those are mediocre markers,” Nerenz said, as serum creatinine is not very sensitive to acute change, and the increase is often detected after the therapeutic window of intervention has passed. In addition, “it only tells us that the kidneys are unhappy; it doesn’t say anything about the cause.”

Urine output is limited as a marker because many conditions affect it. “If you’re dehydrated, urine output is going to decrease,” he said. “And in some forms of AKI, urine output actually goes up.”

What’s needed, he said, is a more sensitive biomarker that’s detectable within a shorter timeframe of 2-6 hours following injury.

“Right now, we’re looking at 48 hours before a change becomes apparent, and that’s just too long. Plus, it should be kidney specific. One of the major limitations of the biomarkers that have been evaluated to this point is that, yes, they’re released by the kidney, but they’re also released by other tissue types within the body, and that hinders their effectiveness as a marker.”
 

Neutrophil Gelatinase-Associated Lipocalin (NGAL)

Although research on better biomarkers is ongoing, “there’s also a recognition that some of the protein markers that have been around for a while, if used appropriately, can provide value,” Nerenz said. These include, among others, NGAL.

NGAL works well in pediatric patients without other comorbidities, but it has been less useful in adult patients because it is also released by other cell types. However, recent research suggests it shows promise in patients with both cirrhosis and AKI.

There are three main causes of AKI in cirrhosis, Nerenz explained. The first is prerenal and can be primarily addressed through rehydration.

“When these patients come in, clinicians won’t do anything right away other than provide fluids. If creatinine improves over the 48-hour period of fluid replenishment, then the patient is sent home because there really isn’t extensive damage to the kidneys.”

If improvement isn’t seen after those 48 hours, then it could be one of two things: Hepatorenal syndrome or acute tubular necrosis. Patients with hepatorenal syndrome are candidates for terlipressin, which the Food and Drug Administration (FDA) approved for this indication in 2022 after it displayed notable efficacy in a double-blind study.

“You don’t want to give terlipressin to just anybody because if the issue is not a diminished blood supply to the kidney, it’s not going to help, and comes with some serious side effects, such as respiratory failure,” Nerenz explained. “Having a biomarker that can distinguish between hepatorenal syndrome and acute tubular necrosis really helps clinicians confidently identify which patients are good candidates for this drug. Right now, we’re flying blind to a certain extent, basically using clinical intuition.”

Currently, the determination of NGAL is FDA cleared only for pediatric use. One way hospitals have dealt with that is by making the test in their own labs, using appropriate reagents, validation, and so forth. These tests are then safe for use in adults but haven’t gone through the FDA approval process.

However, the FDA’s recent announcement stating that the agency should oversee lab-developed tests has made this situation unclear, Nerenz said.

“At this point, we don’t know if there’s still an opportunity to take the NGAL test (or any other cleared biomarker) and validate it for use in a different patient population. Many hospital labs simply don’t have the resources to take these tests through the whole FDA approval process.”
 

 

 

A New Biomarker for AIN?

Meanwhile, research is also moving forward on a better biomarker for AIN, which is also under the AKI umbrella.

“It’s important to diagnose AIN because it has a very specific treatment,” Dennis G. Moledina, MD, PhD, Yale School of Medicine in New Haven, Connecticut, told this news organization.

“AIN is caused by a bunch of different medications, such as proton pump inhibitors, cancer drugs, nonsteroidal anti-inflammatory drugs, and antibiotics, so when someone has this condition, you have to stop potentially life-saving medications and give unnecessary and potentially toxic immunosuppressive drugs, like prednisone,” he said. “If you get the diagnosis wrong, you’re stopping vital drugs and giving immunosuppression for no reason. And if you miss the diagnosis, AIN can lead to permanent chronic kidney disease.”

“Right now, the only way to diagnose AIN is to do a kidney biopsy, which is risky because it can often lead to significant bleeding,” he said. “Some people can’t undergo a biopsy because they’re on medications that increase the risk of bleeding, and they can’t be stopped.”

Furthermore, he noted, “the longer a patient takes a drug that’s causing AIN without getting a diagnosis, the less the chances of recovery because the longer you let this kidney inflammation go on, the more fibrosis and permanent damage develops. So it is important to diagnose it as early as possible, and that’s again why we have a real need for a noninvasive biomarker that can be tested rapidly.”

Moledina and colleagues have been working on identifying a suitable biomarker for close to 10 years, the latest example of which is their 2023 study validating urinary CXCL9 as just such a marker.

“We’re most excited about CXCL9 because it’s already used to diagnose some other diseases in plasma,” Moledina said. “We think that we can convince labs to test it in urine.”

In an accompanying editorial, Mark Canney, PhD, and colleagues at the University of Ottawa and The Ottawa Hospital in Ontario, Canada, wrote that the CXCL9 study findings “are exciting because they provide a road map of where diagnostics can get to for this common, yet poorly identified and treated, cause of kidney damage. The need for a different approach can be readily identified from the fact that clinicians’ gestalt for diagnosing AIN was almost tantamount to tossing a coin (AUC, 0.57). CXCL9 alone outperformed not only the clinician’s prebiopsy suspicion but also an existing diagnostic model and other candidate biomarkers both in the discovery and external validation cohorts.”

Like NGAL, CXCL9 will have to go through the FDA approval process before it can be used for AIN. Therefore, it may be a few years before it can become routinely available, Moledina said.

Nevertheless, Nerenz added, “I think the next steps for AKI are probably continuing on this path of context-dependent, selective biomarker use. I anticipate that we’ll see ongoing development in this space, just expanding to a wider variety of clinical scenarios.”

Nerenz declared receiving research funding from Abbott Labs for evaluation of an AKI biomarker. Moledina is a co-inventor on a pending patent, “Methods and Systems for Diagnosis of Acute Interstitial Nephritis”; a cofounder of the diagnostics company Predict AIN; and a consultant for Biohaven.

A version of this article first appeared on Medscape.com.


Scurvy: A Diagnosis Still Relevant Today

Article Type
Changed
Wed, 11/13/2024 - 02:29

“Petechial rash often prompts further investigation into hematological, dermatological, or vasculitis causes. However, if the above investigations are negative and skin biopsy has not revealed a cause, there is a Renaissance-era diagnosis that is often overlooked but is easily investigated and treated,” wrote Andrew Dermawan, MD, and colleagues from Sir Charles Gairdner Hospital in Nedlands, Australia, in BMJ Case Reports. The diagnosis they highlight is scurvy, a disease that has faded from common medical concern but is reemerging, partly because of the rise in bariatric surgery.

Diagnosing Scurvy in the 2020s

In their article, Dermawan and colleagues present the case of a 50-year-old man with a bilateral petechial rash on his lower limbs, without any history of trauma. The patient, who exhibited no infectious symptoms, also had gross hematuria, microcytic anemia, mild neutropenia, and lymphopenia. Tests for autoimmune and hematological diseases were negative, as were abdominal and leg CT scans, ruling out abdominal hemorrhage and vasculitis. Additionally, a skin biopsy showed no causative findings.

The doctors noted that the patient had undergone sleeve gastrectomy, prompting them to inquire about his diet. They discovered that, because of financial difficulties, his diet primarily consisted of processed foods with little to no fruits or vegetables, and he had stopped taking supplements recommended by his gastroenterologist. Further tests revealed a vitamin D deficiency and a severe deficiency in vitamin C. With the diagnosis of scurvy confirmed, the doctors treated the patient with 1000 mg of ascorbic acid daily, along with cholecalciferol, folic acid, and a multivitamin complex, leading to a complete resolution of his symptoms.

Risk Factors Then and Now

Scurvy can present with a range of symptoms, including petechiae, perifollicular hemorrhage, ecchymosis, gingivitis, edema, anemia, delayed wound healing, malaise, weakness, joint swelling, arthralgia, anorexia, neuropathy, and vasomotor instability. It can cause mucosal and gastric hemorrhages, and if left untreated, it can lead to fatal bleeding.

Historically known as “sailors’ disease,” scurvy plagued men on long voyages who lacked access to fresh fruits or vegetables and thus did not get enough vitamin C. In 1747, James Lind, a British physician in the Royal Navy, demonstrated that the consumption of oranges and lemons could combat scurvy.

Today’s risk factors for scurvy include malnutrition, gastrointestinal disorders (eg, chronic inflammatory bowel diseases), alcohol and tobacco use, eating disorders, psychiatric illnesses, dialysis, and the use of medications that reduce the absorption of ascorbic acid (such as corticosteroids and proton pump inhibitors).

Scurvy remains more common among individuals with unfavorable socioeconomic conditions. The authors of the case report emphasize how the rising cost of living — specifically in Australia but applicable elsewhere — is changing eating habits, leading to high consumption of low-cost, nutritionally poor foods.

Poverty has always been a risk factor for scurvy, but today there may be an additional cause: bariatric surgery. Patients undergoing these procedures are at risk for deficiencies in the fat-soluble vitamins A, D, E, and K, and if their diet is inadequate, they may also experience a vitamin C deficiency. Awareness of this can facilitate the timely diagnosis of scurvy in these patients.

This story was translated from Univadis Italy using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.


On Second Thought: Aspirin for Primary Prevention — What We Really Know

Article Type
Changed
Wed, 11/13/2024 - 02:26

This transcript has been edited for clarity.

Aspirin. Once upon a time, everybody over age 50 years was supposed to take a baby aspirin. Now we make it a point to tell people to stop. What is going on?  

Our recommendations vis-à-vis aspirin have evolved at a dizzying pace. The young’uns watching us right now don’t know what things were like in the 1980s. The Reagan era was a wild, heady time when nuclear war was imminent and we didn’t prescribe aspirin to patients.

That only started in 1988, which was a banner year in human history. Not because a number of doves were incinerated by the lighting of the Olympic torch at the Seoul Olympics — look it up if you don’t know what I’m talking about — but because 1988 saw the publication of the ISIS-2 trial, which first showed a mortality benefit to prescribing aspirin post–myocardial infarction (MI).

Giving patients aspirin during or after a heart attack is not controversial. It’s one of the few things in this business that isn’t, but that’s secondary prevention — treating somebody after they develop a disease. Primary prevention, treating them before they have their incident event, is a very different ballgame. Here, things are messy. 

For one thing, the doses used have been very inconsistent. We should point out that the reason for 81 mg of aspirin is very arbitrary and is rooted in the old apothecary system of weights and measures. A standard dose of aspirin was 5 grains, where 20 grains made 1 scruple, 3 scruples made 1 dram, 8 drams made 1 oz, and 12 oz made 1 lb (because screw you, metric system). Therefore, 5 grains was roughly 325 mg of aspirin, and one quarter of the standard dose became 81 mg once you rounded off the decimal.
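A quick check of that arithmetic, assuming the standard conversion of roughly 64.8 mg per grain:

$$
5~\text{grains} \times 64.8~\text{mg/grain} \approx 324~\text{mg} \approx 325~\text{mg},
\qquad
\frac{325~\text{mg}}{4} = 81.25~\text{mg} \approx 81~\text{mg}.
$$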

People have tried all kinds of dosing structures with aspirin prophylaxis. The Physicians’ Health Study used a full-dose aspirin, 325 mg every 2 days, while the Hypertension Optimal Treatment (HOT) trial tested 75 mg daily and the Women’s Health Study tested 100 mg, but every other day. 

Ironically, almost no one has studied 81 mg every day, which is weird if you think about it. The bigger problem here is not the variability of doses used, but the discrepancy when you look at older vs newer studies.

Older studies, like the Physicians’ Health Study, did show a benefit, at least in the subgroup of patients over age 50 years, which is probably where the “everybody over 50 should be taking an aspirin” idea comes from, at least as near as I can tell. 

More recent studies, like the Women’s Health Study, ASPREE, or ASPIRE, didn’t show a benefit. I know what you’re thinking: Newer stuff is always better. That’s why you should never trust anybody over age 40 years. The context of primary prevention studies has changed. In the ’80s and ’90s, people smoked more and we didn’t have the same medications that we have today. We talked about all this in the beta-blocker video to explain why beta-blockers don’t seem to have a benefit post MI.

We have a similar issue here. The magnitude of the benefit with aspirin primary prevention has decreased because we’re all just healthier overall. So, yay! Progress! Here’s where the numbers matter. No one is saying that aspirin doesn’t help. It does. 

If we look at the 2019 meta-analysis published in JAMA, there is a cardiovascular benefit. The numbers bear that out. I know you’re all here for the math, so here we go. Aspirin reduced the composite cardiovascular endpoint from 65.2 to 60.2 events per 10,000 patient-years; or to put it more meaningfully in absolute risk reduction terms, because that’s my jam, an absolute risk reduction of 0.41%, which means a number needed to treat of 241, which is okay-ish. It’s not super-great, but it may be justifiable for something that costs next to nothing. 

The tradeoff is bleeding. Major bleeding increased from 16.4 to 23.1 bleeds per 10,000 patient-years, or an absolute risk increase of 0.47%, which is a number needed to harm of 210. That’s the problem. Aspirin does prevent heart disease. The benefit is small, for sure, but the real problem is that it’s outweighed by the risk of bleeding, so you’re not really coming out ahead. 
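For completeness, the numbers needed to treat and harm follow directly from those absolute risk changes. Plugging in the rounded percentages quoted above gives figures a shade off the published 241 and 210, which were derived from unrounded risk differences:

$$
\text{NNT} = \frac{1}{\text{ARR}} = \frac{1}{0.0041} \approx 244,
\qquad
\text{NNH} = \frac{1}{\text{ARI}} = \frac{1}{0.0047} \approx 213.
$$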

The real tragedy here is that the public is locked into this idea of everyone over age 50 years should be taking an aspirin. Even today, even though guidelines have recommended against aspirin for primary prevention for some time, data from the National Health Interview Survey sample found that nearly one in three older adults take aspirin for primary prevention when they shouldn’t be. That’s a large number of people. That’s millions of Americans — and Canadians, but nobody cares about us. It’s fine. 

That’s the point. We’re not debunking aspirin. It does work. The benefits are just really small in a primary prevention population and offset by the admittedly also really small risks of bleeding. It’s a tradeoff that doesn’t really work in your favor.

But that’s aspirin for cardiovascular disease. When it comes to cancer or DVT prophylaxis, that’s another really interesting story. We might have to save that for another time. Do I know how to tease a sequel or what?

Labos, a cardiologist at Kirkland Medical Center, Montreal, Quebec, Canada, has disclosed no relevant financial relationships.

A version of this article appeared on Medscape.com.


IBS: Understanding a Common Yet Misunderstood Condition

Article Type
Changed
Wed, 11/13/2024 - 02:23

Irritable bowel syndrome (IBS) is one of the most common conditions encountered by both primary care providers and gastroenterologists, with a pooled global prevalence of 11.2%. This functional bowel disorder is characterized by abdominal pain or discomfort, diarrhea and/or constipation, and bloating.

Unfortunately, IBS is often misunderstood or minimized by some healthcare professionals, according to Alan Desmond, MB, consultant in gastroenterology and general internal medicine, Torbay Hospital, UK National Health Service.

Desmond regularly sees patients who either haven’t been accurately diagnosed or have been told, “Don’t worry, it’s ‘just’ irritable bowel syndrome,” he said at the recent International Conference on Nutrition in Medicine.

A 2017 study involving nearly 2000 patients with a history of gastrointestinal (GI) symptoms found that 43.1% of those who met the criteria for IBS were undiagnosed, and among those who were diagnosed, 26% were not receiving treatment.

“Many clinicians vastly underestimate the impact functional GI symptoms have on our patients: lost productivity, becoming homebound or losing employment, the inability to enjoy a meal with friends or family, and always needing to know where the nearest bathroom is, for example,” Desmond said in an interview.

IBS can profoundly affect patients’ mental health. One study found that 38% of patients with IBS attending a tertiary care clinic contemplated suicide because they felt hopeless about ever achieving symptom relief.

Today, several dietary, pharmacologic, and psychological/behavioral approaches are available to treat patients with IBS, noted William D. Chey, MD, AGAF, chief of the Division of Gastroenterology and Hepatology, University of Michigan, Ann Arbor, Michigan.

“Each individual patient may need a different combination of these foundational treatments,” he said. “One size doesn’t fit all.”

Diagnostic Pathway

One reason IBS is so hard to diagnose is that it’s a “symptom-based disorder, with identification of the condition predicated upon certain key characteristics that are heterogeneous,” Chey said in an interview. “IBS in patient ‘A’ may not present the same way as IBS in patient ‘B,’ although there are certain foundational common characteristics.”

IBS involves “abnormalities in the motility and contractility of the GI tract,” he said. It can present with diarrhea (IBS-D), constipation (IBS-C), or a mixture or alternation of diarrhea and constipation (IBS-M).

Patients with IBS-D often have an exaggerated gastro-colonic response, while those with IBS-C often have a blunted response.

Beyond stool abnormalities and abdominal pain/discomfort, patients often report bloating/distension, low backache, lethargy, nausea, thigh pain, and urinary and gynecologic symptoms.

Historically, IBS has been regarded as a “diagnosis of exclusion” because classic diagnostic tests typically yield no concrete findings. Desmond noted that several blood tests, procedures, imaging studies, and other tests are available to rule out other organic GI conditions, as outlined in the Table.

Tests to rule out other organic GI conditions


If the patient comes from a geographical region where Giardia is endemic, clinicians also should consider testing for the parasite, Chey said.

New Understanding of IBS Etiology

Now, advances in the understanding of IBS are changing the approach to the disease.

“The field is moving away from seeing IBS as a ‘wastebasket diagnosis,’ recognizing that there are other causes of a patient’s symptoms,” Mark Pimentel, MD, associate professor of medicine and gastroenterology, Cedars-Sinai, Los Angeles, said in an interview. “What’s made IBS so difficult to diagnose has been the absence of biological markers and hallmark findings on endoscopy.”

Recent research points to novel bacterial causes as culprits in the development of IBS. In particular, altered small bowel microbiota can be triggered by acute gastroenteritis.

Food poisoning can trigger the onset of IBS — a phenomenon called “postinfectious IBS (PI-IBS),” said Pimentel, who is also executive director of the Medically Associated Science and Technology Program at Cedars-Sinai. PI-IBS almost always takes the form of IBS-D, with up to 60% of patients with IBS-D suffering the long-term sequelae of food poisoning.

The types of bacteria most commonly associated with gastroenteritis are Shigella, Campylobacter, Salmonella, and Escherichia coli, Pimentel said. All of them release cytolethal distending toxin B (CdtB), causing the body to produce antibodies to the toxin.

CdtB resembles vinculin, a naturally occurring protein critical for healthy gut function. “Because of this molecular resemblance, the immune system often mistakes one for the other, producing anti-vinculin,” Pimentel explained.

This autoimmune response leads to disruptions in the gut microbiome, ultimately resulting in PI-IBS. The chain of events “doesn’t necessarily happen immediately,” Pimentel said. “You might have developed food poisoning at a party weeks or months ago.”

Acute gastroenteritis is common, affecting as many as 179 million people in the United States annually. A meta-analysis of 47 studies, incorporating 28,270 patients, found that those who had experienced acute gastroenteritis had a fourfold higher risk of developing IBS compared with nonexposed controls.

“The problem isn’t only the IBS itself, but the fact that people with PI-IBS are four times as likely to contract food poisoning again, which can further exacerbate IBS symptoms,” Pimentel said.

Diarrhea-predominant IBS can be detected through the presence of two blood biomarkers — anti-CdtB and anti-vinculin — in a blood test developed by Pimentel and his group.

“Elevation in either of these biomarkers establishes the diagnosis,” Pimentel said. “This is a breakthrough because it represents the first test that can make IBS a ‘diagnosis of inclusion.’”

The blood test also can identify IBS-M but not IBS-C.

Pimentel said that IBS-C is associated with increased levels of methanogenic archaea, which can be diagnosed by a positive methane breath test. “Methane gas slows intestinal contractility, which might result in constipation,” he said.
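Putting those testing observations together, the way results map to subtypes can be sketched as a toy function; the name, inputs, and outputs here are invented for illustration, and real assays rely on validated quantitative thresholds rather than simple booleans.

```python
# Toy rendering of the testing logic discussed above -- illustration
# only. Real anti-CdtB/anti-vinculin assays and methane breath tests
# use validated quantitative cutoffs, not booleans.

def interpret_ibs_tests(anti_cdtb_elevated: bool,
                        anti_vinculin_elevated: bool,
                        methane_breath_positive: bool) -> str:
    """Return the IBS subtype suggested by the blood and breath tests."""
    if anti_cdtb_elevated or anti_vinculin_elevated:
        # Per Pimentel, elevation of either blood biomarker supports a
        # postinfectious diagnosis (IBS-D; the test can also flag IBS-M)
        return "consistent with postinfectious IBS-D or IBS-M"
    if methane_breath_positive:
        # Methanogen overgrowth slows contractility -> constipation
        return "consistent with IBS-C (methanogen-associated)"
    return "inconclusive; continue standard workup"


print(interpret_ibs_tests(True, False, False))
# -> consistent with postinfectious IBS-D or IBS-M
```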

Diet as a Treatment Option

Diet is usually the starting point for IBS treatment, Chey said. “The standard dietary recommendations, as defined by the National Institute for Health and Care Excellence Guidance for managing IBS, are reasonable and common sense — eating three meals a day, avoiding carbonated beverages, excess alcohol, and excess caffeine, and avoiding hard-to-digest foods that can be gas producing.”

A diet low in fermentable oligosaccharides, disaccharides, monosaccharides and polyols (FODMAPs), which are carbohydrates that aren’t completely absorbed in the intestines, has been shown to be effective in alleviating GI distress in as many as 86% of patients with IBS, leading to improvements in overall GI symptoms as well as individual symptoms (eg, abdominal pain, bloating, constipation, diarrhea, and flatulence).

Desmond recommends the low FODMAP program delineated by Monash University in Australia. The diet should be undertaken only under the supervision of a dietitian, he warned. Moreover, following it on a long-term basis can have an adverse impact on dietary quality and the gut microbiome. Therefore, “it’s important to embark on stepwise reintroduction of FODMAPs under supervision to find acceptable thresholds that don’t cause a return of symptoms.”

A growing body of research suggests that following the Mediterranean diet can be helpful in reducing IBS symptoms. Chey said that some patients who tend to over-restrict their eating might benefit from a less restrictive diet than the typical low FODMAPs diet. For them, the Mediterranean diet may be a good option.

Pharmacotherapy for IBS

Nutritional approaches aren’t for everyone, Chey noted. “Some people don’t want to be on a highly restricted diet.” For them, medications addressing symptoms might be a better option.

Antispasmodics — either anticholinergics (hyoscine and dicyclomine) or smooth muscle relaxants (alverine, mebeverine, and peppermint oil) — can be helpful, although they can worsen constipation in a dose-dependent manner. It is advisable to use them on an as-needed rather than long-term basis.

Antidiarrheal agents include loperamide and diphenoxylate.

For constipation, laxatives (eg, senna, bisacodyl, polyethylene glycol, and sodium picosulfate) can be helpful.

Desmond noted that the American Gastroenterological Association does not recommend routine use of probiotics for most GI disorders, including IBS. Exceptions include prevention of Clostridioides difficile, ulcerative colitis, and pouchitis.

Targeting the Gut-Brain Relationship

Stress plays a role in exacerbating symptoms in patients with IBS and is an important target for intervention.

“If patients are living with a level of stress that’s impairing, we won’t be able to solve their gut issues until we resolve their stress issues,” Desmond said. “We need to calm the gut-microbiome-brain axis, which is multidimensional and bidirectional.”

Many people — even those without IBS — experience queasiness or diarrhea prior to a major event they’re nervous about, Chey noted. These events activate the brain, which activates the nervous system, which interacts with the GI tract. Indeed, IBS is now recognized as a disorder of gut-brain interaction, he said.

“We now know that the microbiome in the GI tract influences cognition and emotional function, depression, and anxiety. One might say that the gut is the ‘center of the universe’ to human beings,” Chey said.

Evidence-based psychological approaches for stress reduction in patients with IBS include cognitive behavioral therapy, specifically tailored to helping the patient identify associations between IBS symptoms and thoughts, emotions, and actions, as well as learning new behaviors and engaging in stress management. Psychodynamic (interpersonal) therapy enables patients to understand the connection between GI symptoms and interpersonal conflicts, emotional factors, or relationship difficulties.

Gut-directed hypnotherapy (GDH) is a “proven modality for IBS,” Desmond said. Unlike other forms of hypnotherapy, GDH focuses specifically on controlling and normalizing GI function. Studies have shown a reduction of ≥ 30% in abdominal pain in two thirds of participants, with overall response rates up to 85%. It can be delivered in an individual or group setting or via a smartphone.

Desmond recommends mindfulness-based therapy (MBT) for IBS. MBT focuses on the “cultivation of mindfulness, defined as intentional, nonjudgmental, present-focused awareness.” It has been found effective in reducing flares and the markers of gut inflammation in ulcerative colitis, as well as reducing symptoms of IBS.

Chey noted that an emerging body of literature supports the potential role of acupuncture in treating IBS, and his clinic employs it. “I would like to see further research into other areas of CAM [complementary and alternative medicine], including herbal approaches to IBS symptoms as well as stress.”

Finally, all the experts agree that more research is needed.

“The real tragedy is that the NIH invests next to nothing in IBS, in contrast to inflammatory bowel disease and many other conditions,” Pimentel said. “Yet IBS is 45 times more common than inflammatory bowel disease.”

Pimentel hopes that with enough advocacy and recognition that IBS isn’t “just stress-related,” more resources will be devoted to understanding this debilitating condition.

Desmond is the author of a book on the benefits of a plant-based diet. He has also received honoraria, speaking, and consultancy fees from the European Space Agency, Dyson Institute of Engineering and Technology, Riverford Organic Farmers, Ltd., Salesforce Inc., Sentara Healthcare, Saudi Sports for All Federation, the Physicians Committee for Responsible Medicine, The Plantrician Project, Doctors for Nutrition, and The Happy Pear.

Pimentel is a consultant for Bausch Health, Ferring Pharmaceuticals, and Ardelyx. He holds equity in and is also a consultant for Dieta Health, Salvo Health, Cylinder Health, and Gemelli Biotech. Cedars-Sinai has a licensing agreement with Gemelli Biotech and Hobbs Medical.

Chey is a consultant to AbbVie, Ardelyx, Atmo, Biomerica, Gemelli Biotech, Ironwood Pharmaceuticals, Nestlé, QOL Medical, Phathom Pharmaceuticals, Redhill, Salix/Valeant, Takeda, and Vibrant. He receives grant/research funding from Commonwealth Diagnostics International, Inc., US Food and Drug Administration, National Institutes of Health, QOL Medical, and Salix/Valeant. He holds stock options in Coprata, Dieta Health, Evinature, FoodMarble, Kiwi Biosciences, and ModifyHealth. He is a board or advisory panel member of the American College of Gastroenterology, GI Health Foundation, International Foundation for Gastrointestinal Disorders, Rome. He holds patents on My Nutrition Health, Digital Manometry, and Rectal Expulsion Device.

A version of this article appeared on Medscape.com.

Publications
Topics
Sections

Irritable bowel syndrome (IBS) is one of the most common conditions encountered by both primary care providers and gastroenterologists, with a pooled global prevalence of 11.2%. This functional bowel disorder is characterized by abdominal pain or discomfort, diarrhea and/or constipation, and bloating.

Unfortunately, IBS is often misunderstood or minimized by some healthcare professionals, according to Alan Desmond, MB, consultant in gastroenterology and general internal medicine, Torbay Hospital, UK National Health Service.

Desmond regularly sees patients who either haven’t been accurately diagnosed or have been told, “Don’t worry, it’s ‘just’ irritable bowel syndrome,” he said at the recent International Conference on Nutrition in Medicine.

A 2017 study involving nearly 2000 patients with a history of gastrointestinal (GI) symptoms found that 43.1% of those who met the criteria for IBS were undiagnosed, and among those who were diagnosed, 26% were not receiving treatment.

“Many clinicians vastly underestimate the impact functional GI symptoms have on our patients in lack of productivity, becoming homebound or losing employment, the inability to enjoy a meal with friends or family, and always needing to know where the nearest bathroom is, for example,” Desmond said in an interview.

IBS can profoundly affect patients’ mental health. One study found that 38% of patients with IBS attending a tertiary care clinic contemplated suicide because they felt hopeless about ever achieving symptom relief.

Today, several dietary, pharmacologic, and psychological/behavioral approaches are available to treat patients with IBS, noted William D. Chey, MD, AGAF, chief of the Division of Gastroenterology and Hepatology, University of Michigan, Ann Arbor, Michigan.

“Each individual patient may need a different combination of these foundational treatments,” he said. “One size doesn’t fit all.”
 

Diagnostic Pathway

One reason IBS is so hard to diagnose is that it’s a “symptom-based disorder, with identification of the condition predicated upon certain key characteristics that are heterogeneous,” Chey said in an interview. “IBS in patient ‘A’ may not present the same way as IBS in patient ‘B,’ although there are certain foundational common characteristics.”

IBS involves “abnormalities in the motility and contractility of the GI tract,” he said. It can present with diarrhea (IBS-D), constipation (IBS-C), or a mixture or alternation of diarrhea and constipation (IBS-M).

Patients with IBS-D often have an exaggerated gastro-colonic response, while those with IBS-C often have a blunted response.

Beyond stool abnormalities and abdominal pain/discomfort, patients often report bloating/distension, low backache, lethargy, nausea, thigh pain, and urinary and gynecologic symptoms.

Historically, IBS has been regarded as a “diagnosis of exclusion” because classic diagnostic tests typically yield no concrete findings. Desmond noted that several blood tests, procedures, imaging studies, and other tests are available to rule out other organic GI conditions, as outlined in the Table.

Tests to rule out other organic GI conditions


If the patient comes from a geographical region where giardia is endemic, clinicians also should consider testing for the parasite, Chey said.
 

New Understanding of IBS Etiology

Now, advances in the understanding of IBS are changing the approach to the disease.

“The field is moving away from seeing IBS as a ‘wastebasket diagnosis,’ recognizing that there are other causes of a patient’s symptoms,” Mark Pimentel, MD, associate professor of medicine and gastroenterology, Cedars-Sinai, Los Angeles, said in an interview. “What’s made IBS so difficult to diagnose has been the absence of biological markers and hallmark findings on endoscopy.”

Recent research points to novel bacterial causes as culprits in the development of IBS. In particular, altered small bowel microbiota can be triggered by acute gastroenteritis.

Food poisoning can trigger the onset of IBS — a phenomenon called “postinfectious IBS (PI-IBS),” said Pimentel, who is also executive director of the Medically Associated Science and Technology Program at Cedars-Sinai. PI-IBS almost always takes the form of IBS-D, with up to 60% of patients with IBS-D suffering the long-term sequelae of food poisoning.

The types of bacteria most commonly associated with gastroenteritis are Shigella, Campylobacter, Salmonella, and Escherichia coli, Pimentel said. All of them release cytolethal distending toxin B (CdtB), causing the body to produce antibodies to the toxin.

CdtB resembles vinculin, a naturally occurring protein critical for healthy gut function. “Because of this molecular resemblance, the immune system often mistakes one for the other, producing anti-vinculin,” Pimentel explained.

This autoimmune response leads to disruptions in the gut microbiome, ultimately resulting in PI-IBS. The chain of events “doesn’t necessarily happen immediately,” Pimentel said. “You might have developed food poisoning at a party weeks or months ago.”

Acute gastroenteritis is common, affecting as many as 179 million people in the United States annually. A meta-analysis of 47 studies, incorporating 28,270 patients, found that those who had experienced acute gastroenteritis had a fourfold higher risk of developing IBS compared with nonexposed controls.

“The problem isn’t only the IBS itself, but the fact that people with PI-IBS are four times as likely to contract food poisoning again, which can further exacerbate IBS symptoms,” Pimentel said.

Diarrhea-predominant IBS can be detected through the presence of two blood biomarkers — anti-CdtB and anti-vinculin — in a blood test developed by Pimentel and his group.

“Elevation in either of these biomarkers establishes the diagnosis,” Pimentel said. “This is a breakthrough because it represents the first test that can make IBS a ‘diagnosis of inclusion.’”

The blood test also can identify IBS-M but not IBS-C.

Pimentel said that IBS-C is associated with increased levels of methanogenic archaea, which can be diagnosed by a positive methane breath test. “Methane gas slows intestinal contractility, which might result in constipation,” he said.
 

 

 

Diet as a Treatment Option

Diet is usually the starting point for IBS treatment, Chey said. “The standard dietary recommendations, as defined by the National Institute for Health and Care Excellence Guidance for managing IBS, are reasonable and common sense — eating three meals a day, avoiding carbonated beverages, excess alcohol, and excess caffeine, and avoiding hard-to-digest foods that can be gas producing.”

A diet low in fermentable oligosaccharides, disaccharides, monosaccharides and polyols (FODMAPs), which are carbohydrates that aren’t completely absorbed in the intestines, has been shown to be effective in alleviating GI distress in as many as 86% of patients with IBS, leading to improvements in overall GI symptoms as well as individual symptoms (eg, abdominal pain, bloating, constipation, diarrhea, and flatulence).

Desmond recommends the low FODMAP program delineated by Monash University in Australia. The diet should be undertaken only under the supervision of a dietitian, he warned. Moreover, following it on a long-term basis can have an adverse impact on dietary quality and the gut microbiome. Therefore, “it’s important to embark on stepwise reintroduction of FODMAPS under supervision to find acceptable thresholds that don’t cause a return of symptoms.”

A growing body of research suggests that following the Mediterranean diet can be helpful in reducing IBS symptoms. Chey said that some patients who tend to over-restrict their eating might benefit from a less restrictive diet than the typical low FODMAPs diet. For them, the Mediterranean diet may be a good option.
 

Pharmacotherapy for IBS

Nutritional approaches aren’t for everyone, Chey noted. “Some people don’t want to be on a highly restricted diet.” For them, medications addressing symptoms might be a better option.

Antispasmodics — either anticholinergics (hyoscine and dicyclomine) or smooth muscle relaxants (alverine, mebeverine, and peppermint oil) — can be helpful, although they can worsen constipation in a dose-dependent manner. It is advisable to use them on an as-needed rather than long-term basis.

Antidiarrheal agents include loperamide and diphenoxylate.

For constipation, laxatives (eg, senna, bisacodyl, polyethylene glycol, and sodium picosulfate) can be helpful.

Desmond noted that the American Gastroenterological Association does not recommend routine use of probiotics for most GI disorders, including IBS. Exceptions include prevention of Clostridioides difficile, ulcerative colitis, and pouchitis.
 

Targeting the Gut-Brain Relationship

Stress plays a role in exacerbating symptoms in patients with IBS and is an important target for intervention.

“If patients are living with a level of stress that’s impairing, we won’t be able to solve their gut issues until we resolve their stress issues,” Desmond said. “We need to calm the gut-microbiome-brain axis, which is multidimensional and bidirectional.”

Many people — even those without IBS — experience queasiness or diarrhea prior to a major event they’re nervous about, Chey noted. These events activate the brain, which activates the nervous system, which interacts with the GI tract. Indeed, IBS is now recognized as a disorder of gut-brain interaction, he said.

“We now know that the microbiome in the GI tract influences cognition and emotional function, depression, and anxiety. One might say that the gut is the ‘center of the universe’ to human beings,” Chey said.

Evidence-based psychological approaches for stress reduction in patients with IBS include cognitive behavioral therapy, specifically tailored to helping the patient identify associations between IBS symptoms and thoughts, emotions, and actions, as well as learning new behaviors and engaging in stress management. Psychodynamic (interpersonal) therapy enables patients to understand the connection between GI symptoms and interpersonal conflicts, emotional factors, or relationship difficulties.

Gut-directed hypnotherapy (GDH) is a “proven modality for IBS,” Desmond said. Unlike other forms of hypnotherapy, GDH focuses specifically on controlling and normalizing GI function. Studies have shown a reduction of ≥ 30% in abdominal pain in two thirds of participants, with overall response rates up to 85%. It can be delivered in an individual or group setting or via a smartphone.

Desmond recommends mindfulness-based therapy (MBT) for IBS. MBT focuses on the “cultivation of mindfulness, defined as intentional, nonjudgmental, present-focused awareness.” It has been found effective in reducing flares and the markers of gut inflammation in ulcerative colitis, as well as reducing symptoms of IBS.

Chey noted that an emerging body of literature supports the potential role of acupuncture in treating IBS, and his clinic employs it. “I would like to see further research into other areas of CAM [complementary and alternative medicine], including herbal approaches to IBS symptoms as well as stress.”

Finally, all the experts agree that more research is needed.

“The real tragedy is that the NIH invests next to nothing in IBS, in contrast to inflammatory bowel disease and many other conditions,” Pimentel said. “Yet IBS is 45 times more common than inflammatory bowel disease.”

Pimentel hopes that with enough advocacy and recognition that IBS isn’t “just stress-related,” more resources will be devoted to understanding this debilitating condition.

Desmond is the author of a book on the benefits of a plant-based diet. He has also received honoraria, speaking, and consultancy fees from the European Space Agency, Dyson Institute of Engineering and Technology, Riverford Organic Farmers, Ltd., Salesforce Inc., Sentara Healthcare, Saudi Sports for All Federation, the Physicians Committee for Responsible Medicine, The Plantrician Project, Doctors for Nutrition, and The Happy Pear.

Pimentel is a consultant for Bausch Health, Ferring Pharmaceuticals, and Ardelyx. He holds equity in and is also a consultant for Dieta Health, Salvo Health, Cylinder Health, and Gemelli Biotech. Cedars-Sinai has a licensing agreement with Gemelli Biotech and Hobbs Medical.

Chey is a consultant to AbbVie, Ardelyx, Atmo, Biomerica, Gemelli Biotech, Ironwood Pharmaceuticals, Nestlé, QOL Medical, Phathom Pharmaceuticals, Redhill, Salix/Valeant, Takeda, and Vibrant. He receives grant/research funding from Commonwealth Diagnostics International, Inc., US Food and Drug Administration, National Institutes of Health, QOL Medical, and Salix/Valeant. He holds stock options in Coprata, Dieta Health, Evinature, FoodMarble, Kiwi Biosciences, and ModifyHealth. He is a board or advisory panel member of the American College of Gastroenterology, GI Health Foundation, International Foundation for Gastrointestinal Disorders, Rome. He holds patents on My Nutrition Health, Digital Manometry, and Rectal Expulsion Device.

A version of this article appeared on Medscape.com.


Coming Soon: A New Disease Definition, ‘Clinical Obesity’

Article Type
Changed
Sun, 11/10/2024 - 17:58

An upcoming document will entirely reframe obesity as a “condition of excess adiposity” that constitutes a disease called “clinical obesity” when related tissue and organ abnormalities are present.

The authors of the new framework are a Lancet Commission of 56 of the world’s leading obesity experts, including academic clinicians, scientists, public health experts, patient representatives, and officers from the World Health Organization. Following peer review, it will be launched via livestream and published in Lancet Diabetes & Endocrinology in mid-January 2025, with formal endorsement from more than 75 medical societies and other relevant stakeholder organizations.

On November 4, 2024, at the Obesity Society’s Obesity Week meeting, the publication’s lead author, Francesco Rubino, MD, Chair of Bariatric and Metabolic Surgery at King’s College London in England, gave a preview. He began by noting that, despite the declaration of obesity as a chronic disease over a decade ago, the concept is still debated and not widely accepted by the public or even by all in the medical community.

“The idea of obesity as a disease remains highly controversial,” Rubino noted, adding that the current body mass index (BMI)–based definition contributes to the debate because it doesn’t distinguish between people whose excess adiposity places them at risk for disease but who are currently healthy and those who have already undergone bodily harm from that adiposity.

“Having a framework that distinguishes at an individual level when you are in a condition of risk and when you have a condition of disease is fundamentally important. You don’t want to blur the picture in either direction, because obviously the consequence would be quite significant. ... So, the commission focused exactly on that point,” he said.

The new paper will propose a two-part clinical approach: First, assess whether the patient has excess adiposity, with methods that will be outlined. Next, assess on an organ-by-organ basis for the presence of abnormalities related to excess adiposity, or “clinical obesity.” The document will also provide those specific criteria, Rubino said, noting that those details are under embargo until January.

However, he did say that “We are going to propose a pragmatic approach to say that BMI alone is not enough in the clinic. It’s okay as a screening tool, but when somebody potentially has obesity, then you have to add additional measures of adiposity that makes sure you decrease the level of risk… Once you have obesity, then you need to establish if it’s clinical or nonclinical.”

Asked to comment, session moderator John D. Clark, MD, PhD, Chief Population Health Officer at Sharp Rees-Stealy Medical Group, San Diego, California, said in an interview, “I think it’ll help explain and move medicine as a whole in a direction to a greater understanding of obesity actually being a disease, how to define it, and how to identify it. And will, I think, lead to a greater understanding of the underlying disease.”

And, Clark said, it should also help target individuals with preventive vs therapeutic approaches. “I would describe it as matching the right tool to the right patient. If a person has clinical obesity, they likely can and would benefit from either different or additional tools, as opposed to otherwise healthy obesity.”

Rubino said he hopes the new framework will prompt improvements in reimbursement and public policy. “Policymakers scratch their heads when they have limited resources and you need to prioritize things. Having an obesity definition that is blurry doesn’t allow you to have a fair, human, and meaningful prioritization. ... Now that we have drugs that cannot be given to 100% of people, how do you decide who gets them first? I hope this will make it easier for people to access treatment. At the moment, it is not only difficult, but it’s also unfair. It’s random. Somebody gets access, while somebody else who is very, very sick has no access. I don’t think that’s what we want.”

A version of this article appeared on Medscape.com.

FROM OBESITY WEEK


Lifestyle Medicine Trends to Keep an Eye On

Article Type
Changed
Wed, 11/06/2024 - 05:03

Our current healthcare system, which is a costly and unending cycle of merely managing chronic disease symptoms, is failing us. What we truly need is a patient-centered approach that restores health by addressing not just diagnoses but also the physical, emotional, and social needs of each individual. This is the essence of whole-person health, and transformation toward this model of care is already underway.

This shift underscores why clinicians like me support placing lifestyle medicine at the foundation of health and healthcare. Evidence-based lifestyle medicine — which applies interventions in nutrition, physical activity, restorative sleep, stress management, positive social connections, and avoidance of risky substances to prevent, treat, and, when used intensively, even reverse lifestyle-related chronic disease — is a medical specialty equipped to address patients’ whole-person health in an effective, high-value clinical care delivery model.

As this transformation continues, here are four key lifestyle medicine trends for 2025.
 

Lifestyle Medicine Becomes More Ingrained in Primary Care

The 2021 National Academies of Sciences, Engineering, and Medicine report, “Implementing High-Quality Primary Care,” sounded the alarm about the state of primary care and outlined a comprehensive approach to transform it. Lifestyle medicine emerged as a solution as clinicians found innovative ways to integrate lifestyle behavior interventions into existing care models in a financially sustainable, scalable manner. Examples include Blue Zones Health, a new delivery model that aligns lifestyle medicine–certified clinicians with communities and payers in California, and the University of Pittsburgh Medical Center lifestyle medicine program, where primary care patients are referred to virtual group coaching, a teaching kitchen, and classes on food as medicine, obesity, type 2 diabetes, and more.

Organizations dedicated to advancing primary care are paying close attention to the potential of lifestyle medicine. The Primary Care Collaborative has launched a multiyear initiative on whole-person care and lifestyle medicine. The initiative aims to broaden the primary care community’s understanding of whole health and lifestyle medicine concepts and the evidence behind them, as well as lay the groundwork for future work to promote whole-person primary care and lifestyle medicine among an engaged and committed community of members.
 

Digital Tools and AI Spark Lifestyle Medicine Innovations

American College of Lifestyle Medicine partner organizations are increasingly using digital tools, such as health apps tailored to lifestyle behavior interventions, to expand access to care and support behavior change. One of the biggest challenges in lifestyle interventions is the limited time available during patient encounters. But artificial intelligence (AI) tools can record (with patient permission) and summarize encounters, enabling clinicians to turn away from their keyboards and be more present to learn about the unique living, environmental, and societal factors that shape every individual’s lifestyle choices. AI tools can also create individualized whole-food, plant-predominant meal plans or physical activity schedules for patients in just a few seconds. The potential for AI in lifestyle medicine is vast, and its applications were further explored at the American College of Lifestyle Medicine’s annual conference in October.

Behavior Change and Sustainability of the Food-as-Medicine Movement

Significant investments have been made in food as medicine to address diet-related chronic diseases. But merely providing medically tailored meals or produce prescriptions is not enough, because once the prescriptions end, so do the health benefits. Clinicians certified in lifestyle medicine are prepared to coach patients toward long-term behavior change, supporting them with education and information so they can shop for and prepare tasty, nutritious, and affordable food. The same applies to the use of glucagon-like peptide 1 drugs: although the initial weight loss offers motivation, lifestyle changes are necessary to sustain long-term health benefits beyond the medications.

 

 

Lifestyle Medicine Emerges as a Strategy to Achieve Health Equity 

Lifestyle behavior interventions have the unique ability to address both health status and social drivers of health. For example, food as medicine affects an individual’s health while also addressing nutrition security. Certainly, no medication can both improve health status and feed someone. The addition of payment for screening for social drivers of health to the 2024 Medicare Physician Fee Schedule is an important step toward connecting clinicians with community-based organizations that can address factors influencing patients’ ability to adhere to lifestyle behavior care plans. Lifestyle medicine clinicians are poised to lead this effort because they are already having conversations with patients about their environment, living conditions, and access to nutritious food.

The changes coming to our healthcare system are exciting and long overdue. Lifestyle medicine is positioned to be at the forefront of this transformation now and in the future.

Dr. Patel, president of the American College of Lifestyle Medicine in St. Louis, has disclosed no relevant financial relationships.

A version of this article appeared on Medscape.com.


Myth of the Month: Are Thickened Liquids Helpful in Dysphagia Patients?

Article Type
Changed
Wed, 11/06/2024 - 04:58

Case: An 80-year-old man with advanced Alzheimer’s disease is admitted to the hospital after a fall. He is noted to have coughing spells after drinking liquids. A swallowing study shows severely slowed esophageal motility.

What do you recommend?

A. Feeding tube

B. Thickened liquids

C. Continue current diet



The correct answer for this patient is C: Continue the current diet. He does not need thickened liquids, and a feeding tube would not be recommended.1

Dr. Douglas S. Paauw

So are there any data supporting the widespread use of thickened liquid diets for patients with dysphagia and aspiration?

Multiple clinical guidelines for stroke recommend the use of thickened liquids despite minimal to no evidence of efficacy.2 It is a common practice to give patients with advanced dementia thickened liquids, especially in the hospital setting. Does this help?

Makhnevich and colleagues published a cohort study of Alzheimer’s disease patients with dysphagia admitted to the hospital over a 5-year period.3 Almost half of the cohort received a thickened liquid diet; these patients were matched with patients who received a thin liquid diet. There was no significant difference in hospital mortality between the groups that received thick liquids and thin liquids (hazard ratio, 0.92; P = .46). Patients receiving thickened liquids were less likely to be intubated but were more likely to have pulmonary infections.

A 2018 Cochrane review concluded that there was no consensus on the immediate and long-term effects of modifying fluid consistency for swallowing difficulties in dementia because too few studies have been completed.4 So why does this information, or the lack of it, matter?

What is so bad about a thickened liquid diet?

Eric Widera, MD, described in JAMA Internal Medicine his and his hospice and palliative care team’s experience of drinking thickened liquids.5 He drank only thickened liquids for a 12-hour period. “The challenge was eye-opening. It was the first time I experienced the terrible taste and texture of thickened liquids,” he wrote. He also cited some of the risks of thickened liquids: dehydration, poor oral intake, and decreased quality of life.

The bottom line is that there is scant evidence for the benefit of thickened liquids, especially for patients with advanced dementia and dysphagia, and giving thickened liquids is not a benign intervention, because of poor tolerability of the diet.

References

1. American Geriatrics Society Ethics Committee and Clinical Practice and Models of Care Committee. J Am Geriatr Soc. 2014;62(8):1590-3.

2. McCurtin A et al. J Eval Clin Pract. 2020;26:1744-60.

3. Makhnevich A et al. JAMA Intern Med. 2024 Jul 1;184(7):778-85.

4. Flynn E et al. Cochrane Database Syst Rev. 2018 Sep 24;9(9):CD011077.

5. Widera E. JAMA Intern Med. 2024 Jul 1;184(7):786-7.


Humans and Carbs: A Complicated 800,000-Year Relationship

Article Type
Changed
Tue, 10/29/2024 - 05:47

Trying to reduce your carbohydrate intake means going against nearly a million years of evolution.

Humans are among a few species with multiple copies of certain genes that help us break down starch — carbs like potatoes, beans, corn, and grains — so that we can turn it into energy our bodies can use.

However, it’s been difficult for researchers to pinpoint when in human history we acquired multiple copies of these genes because they’re in a region of the genome that’s hard to sequence.

A recent study published in Science suggests that humans may have developed multiple copies of the gene for amylase — an enzyme that’s the first step in starch digestion — over 800,000 years ago, long before the agricultural revolution. This genetic change could have helped us adapt to eating starchy foods.

The study shows how “what your ancestors ate thousands of years ago could be affecting our genetics today,” said Kelsey Jorgensen, PhD, a biological anthropologist at The University of Kansas, Lawrence, who was not involved in the study.

The double-edged sword has sharpened over all those centuries. On one hand, the human body needs and craves carbs to function. On the other hand, our modern-day consumption of carbs, especially calorie-dense, nutritionally barren processed carbs, has long since passed “healthy.”
 

How Researchers Found Our Carb-Lover Gene

The enzyme amylase turns complex carbs into maltose, a sweet-tasting sugar made of two glucose molecules linked together. We make two kinds of amylase: salivary amylase, which breaks down carbs in our mouths, and pancreatic amylase, which is secreted into our small intestines.

Modern humans have multiple copies of both amylases. Past research showed that human populations with diets high in starch can have up to nine copies of the gene for salivary amylase, called AMY1.

To pinpoint when in human history we acquired multiple copies of AMY1, the new study used two novel techniques, optical genome mapping and long-read sequencing, to sequence and analyze the genes. The researchers sequenced 98 modern-day samples and 68 ancient DNA samples, including one from a Siberian person who lived 45,000 years ago.

The ancient DNA data in the study allowed the researchers to track how the number of amylase genes changed over time, said George Perry, PhD, an anthropological geneticist at The Pennsylvania State University-University Park (he was not involved in the study).

Based on the sequencing, the team analyzed changes in the genes in their samples to gauge evolutionary timelines. Perry noted that this was a “very clever approach to estimating the amylase copy number mutation rate, which in turn can really help in testing evolutionary hypotheses.”

The researchers found that even before farming, hunter-gatherers had between four and eight AMY1 genes in their cells. This suggests that people across Eurasia already had a number of these genes long before they started growing crops. (Recent research indicates that Neanderthals also ate starchy foods.)

“Even archaic hominins had these [genetic] variations and that indicates that they were consuming starch,” said Feyza Yilmaz, PhD, an associate computational scientist at The Jackson Laboratory in Bar Harbor, Maine, and a lead author of the study.

However, the research indicates that even more AMY1 copies were acquired about 4000 years ago, after the agricultural revolution. Yilmaz noted, “with the advance of agriculture, we see an increase in high amylase copy number haplotypes. So genetic variation goes hand in hand with adaptation to the environment.”

A previous study showed that species that share an environment with humans, such as dogs and pigs, also have copy number variation in their amylase genes, said Yilmaz, indicating a link between genome changes and an increase in starch consumption.
 

 

 

Potential Health Impacts on Modern Humans

The duplications in the AMY1 gene could have allowed humans to better digest starches. And it’s conceivable that having more copies of the gene means being able to break down starches even more efficiently, and those with more copies “may be more prone to having high blood sugar, prediabetes, that sort of thing,” Jorgensen said.

Whether those with more AMY1 genes have more health risks is an active area of research. “Researchers tested whether there’s a correlation between AMY1 gene copies and diabetes or BMI [body mass index]. And so far, some studies show that there is indeed correlation, but other studies show that there is no correlation at all,” said Yilmaz.

Yilmaz pointed out that only 5 or 10% of carb digestion happens in our mouths, the rest occurs in our small intestine, plus there are many other factors involved in eating and metabolism.

“I am really looking forward to seeing studies which truly figure out the connection between AMY1 copy number and metabolic health and also what type of factors play a role in metabolic health,” said Yilmaz.

It’s also possible that having more AMY1 copies could lead to more carb cravings as the enzyme creates a type of sugar in our mouths. “Previous studies show that there’s a correlation between AMY1 copy number and also the amylase enzyme levels, so the faster we process the starch, the taste [of starches] will be sweeter,” said Yilmaz.

However, the link between cravings and copy numbers isn’t clear. And we don’t exactly know what came first — did the starch in humans’ diet lead to more copies of amylase genes, or did the copies of the amylase genes drive cravings that lead us to cultivate more carbs? We’ll need more research to find out.
 

How Will Today’s Processed Carbs Affect Our Genes Tomorrow?

As our diet changes to increasingly include processed carbs, what will happen to our AMY1 genes is fuzzy. “I don’t know what this could do to our genomes in the next 1000 years or more than 1000 years,” Yilmaz noted, but she said from the evidence it seems as though we may have peaked in AMY1 copies.

Jorgensen noted that this research is focused on a European population. She wonders whether the pattern of AMY1 duplication will be repeated in other populations “because the rise of starch happened first in the Middle East and then Europe and then later in the Americas,” she said.

“There’s individual variation and then there’s population-wide variation,” Jorgensen pointed out. She speculates that the historical diet of different cultures could explain population-based variations in AMY1 genes — it’s something future research could investigate. Other populations may also experience genetic changes as much of the world shifts to a more carb-heavy Western diet.

Overall, this research adds to the growing evidence that humans have a long history of loving carbs — for better and, at least over our most recent history and immediate future, for worse.
 

A version of this article appeared on Medscape.com.

Publications
Topics
Sections

Trying to reduce your carbohydrate intake means going against nearly a million years of evolution.

Humans are among a few species with multiple copies of certain genes that help us break down starch — carbs like potatoes, beans, corn, and grains — so that we can turn it into energy our bodies can use.

However, it’s been difficult for researchers to pinpoint when in human history we acquired multiple copies of these genes because they’re in a region of the genome that’s hard to sequence.

A recent study published in Science suggests that humans may have developed multiple copies of the gene for amylase — an enzyme that’s the first step in starch digestion — over 800,000 years ago, long before the agricultural revolution. This genetic change could have helped us adapt to eating starchy foods.

The study shows how “what your ancestors ate thousands of years ago could be affecting our genetics today,” said Kelsey Jorgensen, PhD, a biological anthropologist at The University of Kansas, Lawrence, who was not involved in the study.

The double-edged sword has sharpened over all those centuries. On one hand, the human body needs and craves carbs to function. On the other, our modern consumption of carbs, especially calorie-dense, nutritionally barren processed carbs, has long since passed "healthy."
 

How Researchers Found Our Carb-Lover Gene

The enzyme amylase turns complex carbs into maltose, a sweet-tasting sugar made of two glucose molecules linked together. We make two kinds of amylase: salivary amylase, which breaks down carbs in our mouths, and pancreatic amylase, which is secreted into the small intestine.

Modern humans have multiple copies of both amylases. Past research showed that human populations with diets high in starch can have up to nine copies of the gene for salivary amylase, called AMY1.

To pinpoint when in human history we acquired multiple copies of AMY1, the new study used two novel techniques, optical genome mapping and long-read sequencing, to sequence and analyze the genes. The researchers sequenced 98 modern-day samples and 68 ancient DNA samples, including one from a Siberian person who lived 45,000 years ago.
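
The study's actual pipeline is more sophisticated, but the basic intuition behind estimating gene copy number from sequencing data is that read coverage over a duplicated gene scales with how many copies a genome carries. The sketch below, with hypothetical depth values, illustrates only that idea; the study's optical genome mapping and long-read approach resolves the AMY1 locus far more precisely than a simple depth ratio.

```python
# Illustrative read-depth sketch of gene copy-number estimation.
# The function and the depth values are hypothetical, not the
# study's code; they show only why relative coverage tracks copies.

def estimate_copy_number(gene_mean_depth: float,
                         genome_mean_depth: float,
                         baseline_copies: int = 2) -> float:
    """Approximate diploid copy number from relative coverage.

    A locus present once per haplotype is covered at roughly the
    genome-wide mean depth, so (gene depth / genome depth) * 2
    approximates the number of copies in a diploid genome.
    """
    return baseline_copies * gene_mean_depth / genome_mean_depth

# Hypothetical sample: AMY1 covered at 90x against a 30x genome
# suggests about 6 copies, within the 4-8 range reported for
# pre-agricultural hunter-gatherers.
print(round(estimate_copy_number(90.0, 30.0)))  # 6
```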

The ancient DNA data in the study allowed the researchers to track how the number of amylase genes changed over time, said George Perry, PhD, an anthropological geneticist at The Pennsylvania State University-University Park (he was not involved in the study).

Based on the sequencing, the team analyzed changes in the genes in their samples to gauge evolutionary timelines. Perry noted that this was a “very clever approach to estimating the amylase copy number mutation rate, which in turn can really help in testing evolutionary hypotheses.”

The researchers found that even before farming, hunter-gatherers had between four and eight AMY1 genes in their cells. This suggests that people across Eurasia already carried multiple copies of the gene long before they started growing crops. (Recent research indicates that Neanderthals also ate starchy foods.)

“Even archaic hominins had these [genetic] variations and that indicates that they were consuming starch,” said Feyza Yilmaz, PhD, an associate computational scientist at The Jackson Laboratory in Bar Harbor, Maine, and a lead author of the study.

However, the research indicates that even more AMY1 copies were acquired 4000 years ago, after the agricultural revolution. Yilmaz noted, "with the advance of agriculture, we see an increase in high amylase copy number haplotypes. So genetic variation goes hand in hand with adaptation to the environment."

A previous study showed that species that share an environment with humans, such as dogs and pigs, also have copy number variation in amylase genes, said Yilmaz, indicating a link between genome changes and an increase in starch consumption.

Potential Health Impacts on Modern Humans

The duplications in the AMY1 gene could have allowed humans to better digest starches, and it's conceivable that having more copies means breaking down starches even more efficiently. Those with more copies "may be more prone to having high blood sugar, prediabetes, that sort of thing," Jorgensen said.

Whether those with more AMY1 genes have more health risks is an active area of research. “Researchers tested whether there’s a correlation between AMY1 gene copies and diabetes or BMI [body mass index]. And so far, some studies show that there is indeed correlation, but other studies show that there is no correlation at all,” said Yilmaz.

Yilmaz pointed out that only 5%-10% of carb digestion happens in our mouths; the rest occurs in the small intestine, and many other factors are involved in eating and metabolism.

“I am really looking forward to seeing studies which truly figure out the connection between AMY1 copy number and metabolic health and also what type of factors play a role in metabolic health,” said Yilmaz.

It’s also possible that having more AMY1 copies could lead to more carb cravings as the enzyme creates a type of sugar in our mouths. “Previous studies show that there’s a correlation between AMY1 copy number and also the amylase enzyme levels, so the faster we process the starch, the taste [of starches] will be sweeter,” said Yilmaz.

However, the link between cravings and copy numbers isn’t clear. And we don’t exactly know what came first — did the starch in humans’ diet lead to more copies of amylase genes, or did the copies of the amylase genes drive cravings that lead us to cultivate more carbs? We’ll need more research to find out.
 

How Will Today’s Processed Carbs Affect Our Genes Tomorrow?

As our diet shifts to include more processed carbs, what will happen to our AMY1 genes is fuzzy. "I don't know what this could do to our genomes in the next 1000 years or more than 1000 years," Yilmaz noted, but she said the evidence suggests we may already have peaked in AMY1 copy number.

Jorgensen noted that this research is focused on a European population. She wonders whether the pattern of AMY1 duplication will be repeated in other populations “because the rise of starch happened first in the Middle East and then Europe and then later in the Americas,” she said.

“There’s individual variation and then there’s population-wide variation,” Jorgensen pointed out. She speculates that the historical diet of different cultures could explain population-based variations in AMY1 genes — it’s something future research could investigate. Other populations may also experience genetic changes as much of the world shifts to a more carb-heavy Western diet.

Overall, this research adds to the growing evidence that humans have a long history of loving carbs — for better and, at least over our most recent history and immediate future, for worse.
 

A version of this article appeared on Medscape.com.


Disc Degeneration in Chronic Low Back Pain: Can Stem Cells Help?

Article Type
Changed
Wed, 11/06/2024 - 04:49

 

TOPLINE:

Allogeneic bone marrow–derived mesenchymal stromal cells (BM-MSCs) are safe but do not show efficacy in treating intervertebral disc degeneration (IDD) in patients with chronic low back pain.

METHODOLOGY:

  • The RESPINE trial assessed the efficacy and safety of a single intradiscal injection of allogeneic BM-MSCs in the treatment of chronic low back pain caused by single-level IDD.
  • Overall, 114 patients (mean age, 40.9 years; 35% women) with IDD-associated chronic low back pain persistent for 3 months or more despite conventional medical therapy and without previous surgery were recruited across four European countries from April 2018 to April 2021 and randomly assigned to receive either intradiscal injections of allogeneic BM-MSCs (n = 58) or sham injections (n = 56).
  • The first co-primary endpoint was the rate of response to BM-MSC injections at 12 months after treatment, defined as improvement of at least 20% or 20 mm on the Visual Analog Scale for pain or improvement of at least 20% in the Oswestry Disability Index for functional status (a minimal sketch of this responder rule appears after this list).
  • The second co-primary endpoint was structural efficacy, based on disc fluid content measured by quantitative T2 MRI between baseline and month 12.
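
Stated operationally, that first endpoint reduces to a simple rule. Below is a minimal sketch, assuming VAS is recorded in millimeters and ODI as a percentage score; the function name, inputs, and example values are illustrative, not the investigators' analysis code.

```python
# Minimal sketch of the RESPINE 12-month responder rule described
# above. The thresholds come from the endpoint definition; everything
# else (names, example values) is hypothetical.

def is_responder(vas_baseline_mm: float, vas_month12_mm: float,
                 odi_baseline: float, odi_month12: float) -> bool:
    """True if pain or function improved past the trial's thresholds."""
    vas_drop = vas_baseline_mm - vas_month12_mm
    vas_ok = vas_drop >= 20 or (
        vas_baseline_mm > 0 and vas_drop / vas_baseline_mm >= 0.20
    )
    odi_drop = odi_baseline - odi_month12
    odi_ok = odi_baseline > 0 and odi_drop / odi_baseline >= 0.20
    return vas_ok or odi_ok

# Hypothetical patient: VAS falls from 70 mm to 50 mm (a 20 mm drop)
# while ODI barely changes; the pain criterion alone qualifies them.
print(is_responder(70.0, 50.0, 40.0, 38.0))  # True
```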

TAKEAWAY:

  • At 12 months post-intervention, 74% of patients in the BM-MSC group were classified as responders compared with 68.8% in the placebo group. However, the difference between the groups was not statistically significant (a rough illustrative test appears after this list).
  • The probability of being a responder was higher in the BM-MSC group than in the sham group; however, the findings did not reach statistical significance.
  • The average change in disc fluid content, indicative of disc regeneration, from baseline to 12 months was 37.9% in the BM-MSC group and 41.7% in the placebo group, with no significant difference between the groups.
  • The incidence of adverse events and serious adverse events was not significantly different between the treatment groups.
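
To see why a 74% vs 68.8% split can fail to reach significance at this sample size, here is a rough back-of-the-envelope check, assuming responder counts rounded to whole patients in the randomized arms; it is not the trial's actual statistical analysis, which would also account for dropouts.

```python
# Rough illustration only: a pooled two-proportion z-test on the
# reported responder rates, using the randomized arm sizes (58, 56)
# and counts rounded to whole patients (hypothetical rounding).
from math import sqrt, erf

def two_proportion_z(x1: int, n1: int, x2: int, n2: int):
    """Return the pooled two-proportion z statistic and two-sided p value."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_two_sided

# ~74% of 58 is about 43 responders; ~68.8% of 56 is about 39.
z, p = two_proportion_z(43, 58, 39, 56)
print(f"z = {z:.2f}, p = {p:.2f}")  # roughly z = 0.5, p = 0.6
```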

IN PRACTICE:

“BM-MSC represents a promising opportunity for the biological treatment of IDD, but only high-quality randomized controlled trials, comparing it to standard care, can determine whether it is a truly effective alternative to spine fusion or disc replacement,” the authors wrote.

SOURCE:

The study was led by Yves-Marie Pers, MD, PhD, Clinical Immunology and Osteoarticular Diseases Therapeutic Unit, CHRU Lapeyronie, Montpellier, France. It was published online on October 11, 2024, in Annals of the Rheumatic Diseases.

LIMITATIONS:

MRI results were collected from only 55 patients across both trial arms, which may have affected the statistical power of the findings. Although patients were monitored for up to 24 months, the long-term efficacy and safety of BM-MSC therapy for IDD may not have been fully captured. Selection bias could not be excluded because of the difficulty in accurately identifying patients with chronic low back pain caused by single-level IDD.

DISCLOSURES:

The study was funded by the European Union’s Horizon 2020 Research and Innovation Programme. The authors declared no conflicts of interest.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.
