Collaboration Tackles Steroid-Induced Adrenal Insufficiency
Endocrinologists in Europe and the United States have come together to produce joint guidance to help clinicians manage patients who have, or are at risk for, glucocorticoid-induced adrenal insufficiency (GC-AI).
Publication of the guidance marks the first time that the European Society of Endocrinology (ESE) and the Endocrine Society have cooperated in producing a guideline.
The guideline “Diagnosis and therapy of glucocorticoid-induced adrenal insufficiency” is published in the May 2024 issues of the societies’ respective journals, the European Journal of Endocrinology and The Journal of Clinical Endocrinology & Metabolism.
Felix Beuschlein, PhD, from the ESE, who cochaired the clinical committee, told this news organization: “You would hope that this leads to a common ground for a very large number of patients.”
The risk for GC-AI depends on the dose, duration, and potency of the glucocorticoid; the route of administration; and the susceptibility of the individual patient. Once it develops or is suspected, careful education and management of affected patients are required.
Glucocorticoids Commonly Prescribed
“Glucocorticoid-induced adrenal insufficiency is actually a potential concern for a lot of patients,” coauthor Tobias Else, MD, of the department of internal medicine at the University of Michigan, Ann Arbor, explained to this news organization. “Roughly 1% of all people are being treated with glucocorticoids at any given time.”
“That’s a tremendous number, and it gives the scale of the situation,” added Dr. Beuschlein, director of the department of endocrinology, diabetology, and clinical nutrition at University Hospital Zürich in Switzerland. “Now, fortunately, only a very small proportion of patients who are treated with glucocorticoids do have endocrine problems, and this is what this guideline is actually concentrating on.”
Glucocorticoids are effective agents for treating autoimmune and inflammatory disorders. However, they can cause adverse reactions, particularly when administered at high doses and/or for a prolonged period.
Some studies have reported that even low-dose glucocorticoid use, such as prednisone at 2.5-7.5 mg/d, can increase the risk for cardiovascular disease, severe infections, hypertension, diabetes, osteoporosis, and fractures, as well as increase overall mortality with concurrent type 2 diabetes.
Tapering glucocorticoids can be challenging when symptoms of glucocorticoid withdrawal develop, which overlap with those of adrenal insufficiency, the guidelines stated. In general, tapering of glucocorticoids can occur more rapidly within a supraphysiological range, followed by a slower taper when on physiological glucocorticoid dosing.
The degree and persistence of hypothalamic-pituitary-adrenal (HPA) axis suppression after glucocorticoid therapy is stopped depend on overall exposure and recovery of adrenal function. “This is a quite individual situation, as you can imagine, because it’s about sex, age, and comorbidities, the kind of glucocorticoid or other medications that you’re giving,” said Dr. Beuschlein. To cover these contingencies, the paper presents tables explaining management in various scenarios.
Leonie van Hulsteijn, MD, from the department of clinical epidemiology, Leiden University Medical Center, Leiden, the Netherlands, said: “There are so many other specialties prescribing glucocorticoids; so especially the rheumatologist, the pulmonologist, the general practitioners.”
Asked by this news organization whether the guidelines might dissuade some clinicians from offering glucocorticoids, Dr. van Hulsteijn, who contributed to the guidance, said, “I don’t think it will keep them from prescribing it, but I really hope it will make them aware if somebody, after using long-term glucocorticoids, presents with complaints of adrenal insufficiency, that they will be aware and take immediate action.”
Evidence Gaps
The review team took around 2.5 years to draw up the guidance amid some concerns about the quality of the evidence base, which they mainly rated as “low” or “very low.” “I think we all, going through the literature, were quite astonished at how bad the evidence is for a problem as global as that,” said Dr. Beuschlein. “But that’s how it is — sometimes, even in the absence of strong evidence, you have to give some kind of guidance.”
Nevertheless, the authors have called for more research to establish risk factors contributing to the development of and susceptibility to adrenal insufficiency, a greater understanding of glucocorticoid withdrawal, and identification of glucocorticoids retaining immunosuppressive and anti-inflammatory properties that have less effect on HPA axis suppression and an improved adverse effect profile.
Patient-facing materials on GC-AI are also in development and will be made available via the ESE Patient Zone this month.
Next year, the societies plan to publish a joint guideline on diabetes in pregnancy. That will be followed in 2026 by guidance on arginine vasopressin resistance and arginine vasopressin deficiency and a guideline on male hypogonadism in 2027.
Funding for the development of joint guidelines was provided by the societies and did not involve support from other bodies.
Dr. Beuschlein declared funding from the German Research Funding Agency, the Swiss National Science Foundation, University Medicine Zürich, the Vontobel Foundation, the Swiss Heart Foundation, and consultancy work for Bayer AG. Dr. Else declared membership of the advisory board of Merck and Company. Dr. van Hulsteijn declared no conflicts of interest.
A version of this article appeared on Medscape.com.
Nocturnal Hot Flashes and Alzheimer’s Risk
In a recent article in the American Journal of Obstetrics & Gynecology, Rebecca C. Thurston, PhD, and Pauline Maki, PhD, leading scientists in the area of menopause’s impact on brain function, presented data from their assessment of 248 late perimenopausal and postmenopausal women who reported hot flashes, also known as vasomotor symptoms (VMS).
Hot flashes are known to be associated with changes in brain white matter, carotid atherosclerosis, brain function, and memory. Dr. Thurston and colleagues objectively measured VMS over 24 hours, using skin conductance monitoring. Plasma concentrations of Alzheimer’s disease biomarkers, including the amyloid beta 42–to–amyloid beta 40 ratio, were assessed. The mean age of study participants was 59 years, and they experienced a mean of five objective VMS daily.
A key finding was that VMS, particularly those occurring during sleep, were associated with a significantly lower amyloid beta 42–to–beta 40 ratio. This finding suggests that nighttime VMS may be a marker of risk for Alzheimer’s disease.
Previous research has found that menopausal hormone therapy is associated with favorable changes in Alzheimer’s disease biomarkers. Likewise, large observational studies have shown a lower incidence of Alzheimer’s disease among women who initiate hormone therapy in their late perimenopausal or early postmenopausal years and continue such therapy long term.
The findings of this important study by Thurston and colleagues provide further evidence to support the tantalizing possibility that agents that reduce nighttime hot flashes (including hormone therapy) may lower the subsequent incidence of Alzheimer’s disease in high-risk women.
Dr. Kaunitz is a tenured professor and associate chair in the department of obstetrics and gynecology at the University of Florida College of Medicine–Jacksonville, and medical director and director of menopause and gynecologic ultrasound services at the University of Florida Southside Women’s Health, Jacksonville. He disclosed ties to Sumitomo Pharma America, Mithra, Viatris, Bayer, Merck, Mylan (Viatris), and UpToDate.
A version of this article appeared on Medscape.com.
Testosterone/CVD Risk Debate Revived by New Meta-Analysis
A new systematic literature review adds complexity to the controversy over testosterone’s relationship to risk for myocardial infarction, stroke, cardiovascular death, and all-cause mortality.
Last year, the TRAVERSE (Testosterone Replacement Therapy for Assessment of Long-term Vascular Events and Efficacy ResponSE in Hypogonadal Men) trial was the first randomized, placebo-controlled study designed and powered to determine whether testosterone therapy increased risk for major cardiovascular events in men (ages 45-80 years). Its conclusions provided reassurance that modest use of testosterone therapy short term does not increase CVD risk.
But other studies have reached different conclusions, and TRAVERSE left unanswered questions, so Bu B. Yeap, MBBS, PhD, an endocrinologist at the University of Western Australia in Crawley, and colleagues completed a literature review of 11 prospective cohort studies of community-dwelling men whose sex steroid levels were measured with mass spectrometry. Nine of the studies provided individual participant data (IPD); two used aggregate data, and all had at least 5 years of follow-up.
The findings were published in Annals of Internal Medicine.
Dr. Yeap’s team concluded that certain groups of men have higher risk for CVD events. In this study, men with very low testosterone, high luteinizing hormone (LH), or very low estradiol concentrations had higher all-cause mortality. Sex hormone–binding globulin (SHBG) concentration was positively associated and dihydrotestosterone (DHT) levels were nonlinearly associated with all-cause mortality and CVD mortality.
The testosterone level below which men had higher risk of death from any cause was 7.4 nmol/L (213 ng/dL), regardless of LH concentration, the researchers concluded, writing, “This adds to information on reference ranges based on distributions of testosterone in selected samples of healthy men.”
The link between higher SHBG concentrations and higher all-cause mortality “may be related to its role as the major binding protein for sex steroids in the circulation,” the authors wrote. “We found a U-shaped association of DHT with all-cause and CVD-related mortality risks, which were higher at lower and very high DHT concentrations. Men with very low DHT concentrations also had increased risk for incident CVD events. Further investigation into potential underlying mechanisms for these associations is warranted.”
Rigorous Methodology Adds Value
Bradley D. Anawalt, MD, with the University of Washington School of Medicine in Seattle, pointed out in an accompanying editorial that the study’s findings are particularly valuable because of the team’s rigorous methodology. The team measured testosterone with the gold standard, mass spectrometry, which can also measure DHT and estradiol more accurately than widely available commercial immunoassays, which “are inaccurate for measurement of these sex steroids in men, who typically have low serum concentrations of these two metabolites of testosterone,” Dr. Anawalt said.
Also, the researchers obtained raw data from the nine IPD studies and reanalyzed the combined data, which allows for more sophisticated analysis when combining data from multiple studies, Dr. Anawalt explained.
The main finding from the Yeap et al. study, he wrote, is that high testosterone concentrations at baseline were not linked with increased deaths from CVD or from all causes “but very low serum total testosterone concentrations at baseline were.
“It is tempting to hypothesize that testosterone therapy might have cardiovascular benefits solely in patients with very low concentrations of serum total testosterone,” Dr. Anawalt wrote.
He pointed out as particularly interesting the findings for DHT and estradiol.
“The finding that a low serum estradiol concentration is associated with higher all-cause mortality adds another reason (in addition to the adverse effects on body fat and bone health) to avoid aromatase inhibitors that are commonly taken by persons who use anabolic steroids,” he wrote. “The prospect of a U-shaped curve for the relationship between serum DHT and higher cardiovascular risk warrants further study.”
The work is funded by the Government of Western Australia and Lawley Pharmaceuticals. The authors’ and editorial writer’s conflicts of interest are listed in the full study.
A new systematic literature review adds complexity to the controversy over testosterone’s relationship to risk for myocardial infarction, stroke, cardiovascular death, and all-cause mortality.
Last year, the TRAVERSE (Testosterone Replacement Therapy for Assessment of Long-term Vascular Events and Efficacy ResponSE in Hypogonadal Men) trial was the first randomized, placebo-controlled study designed and powered to determine whether testosterone therapy increased risk for major cardiovascular events in men (ages 45-80 years). Its conclusions provided reassurance that modest use of testosterone therapy short term does not increase CVD risk.
But other studies have had different conclusions and TRAVERSE left unanswered questions, so Bu B. Yeap, MBBS, PhD, an endocrinologist at the University of Western Australia in Crawley, and colleagues completed a literature review with 11 prospective cohort studies of community-dwelling men with sex steroid levels measured with mass spectrometry. Nine of the studies provided individual participation data (IPD); two used aggregate data, and all had at least 5 years of follow-up.
The findings were published in Annals of Internal Medicine .
Dr. Yeap’s team concluded that certain groups of men have higher risk for CVD events. In this study, men with very low testosterone, high luteinizing hormone (LH), or very low estradiol concentrations had higher all-cause mortality. Sex hormone–binding globulin (SHBG) concentration was positively associated and dihydrotestosterone (DHT) levels were nonlinearly associated with all-cause mortality and CVD mortality.
The testosterone level below which men had higher risk of death from any cause was 7.4 nmol/L (213 ng/dL), regardless of LH concentration, the researchers concluded, writing, “This adds to information on reference ranges based on distributions of testosterone in selected samples of healthy men.”
A new systematic literature review adds complexity to the controversy over testosterone’s relationship to risk for myocardial infarction, stroke, cardiovascular death, and all-cause mortality.
Last year, the TRAVERSE (Testosterone Replacement Therapy for Assessment of Long-term Vascular Events and Efficacy ResponSE in Hypogonadal Men) trial was the first randomized, placebo-controlled study designed and powered to determine whether testosterone therapy increased the risk for major cardiovascular events in men aged 45-80 years. Its conclusions provided reassurance that modest, short-term use of testosterone therapy does not increase CVD risk.
But other studies have reached different conclusions, and TRAVERSE left unanswered questions, so Bu B. Yeap, MBBS, PhD, an endocrinologist at the University of Western Australia in Crawley, and colleagues completed a literature review of 11 prospective cohort studies of community-dwelling men whose sex steroid levels were measured with mass spectrometry. Nine of the studies provided individual participant data (IPD); two used aggregate data, and all had at least 5 years of follow-up.
The findings were published in Annals of Internal Medicine .
Dr. Yeap’s team concluded that certain groups of men have higher risk for CVD events. In this study, men with very low testosterone, high luteinizing hormone (LH), or very low estradiol concentrations had higher all-cause mortality. Sex hormone–binding globulin (SHBG) concentration was positively associated and dihydrotestosterone (DHT) levels were nonlinearly associated with all-cause mortality and CVD mortality.
The testosterone level below which men had higher risk of death from any cause was 7.4 nmol/L (213 ng/dL), regardless of LH concentration, the researchers concluded, writing, “This adds to information on reference ranges based on distributions of testosterone in selected samples of healthy men.”
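The 7.4 nmol/L threshold and its 213 ng/dL equivalent are consistent with the standard unit conversion for testosterone (molecular weight of roughly 288.4 g/mol). A minimal sketch of that conversion, using this reference molecular weight rather than any figure from the study:

```python
def nmol_per_l_to_ng_per_dl(nmol_per_l, mol_weight=288.4):
    """Convert a steroid concentration from nmol/L to ng/dL.

    mol_weight is in g/mol, so 1 nmol of substance weighs mol_weight ng.
    """
    ng_per_l = nmol_per_l * mol_weight  # nmol/L -> ng/L
    return ng_per_l / 10                # 1 L = 10 dL

print(round(nmol_per_l_to_ng_per_dl(7.4)))  # → 213
```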
The link between higher SHBG concentrations and higher all-cause mortality “may be related to its role as the major binding protein for sex steroids in the circulation,” the authors wrote. “We found a U-shaped association of DHT with all-cause and CVD-related mortality risks, which were higher at lower and very high DHT concentrations. Men with very low DHT concentrations also had increased risk for incident CVD events. Further investigation into potential underlying mechanisms for these associations is warranted.”
Rigorous Methodology Adds Value
Bradley D. Anawalt, MD, with the University of Washington School of Medicine in Seattle, pointed out in an accompanying editorial that the study’s findings are particularly valuable because of the team’s rigorous methodology. The team measured testosterone with the gold standard, mass spectrometry, which can also measure DHT and estradiol more accurately than widely available commercial immunoassays, which “are inaccurate for measurement of these sex steroids in men, who typically have low serum concentrations of these two metabolites of testosterone,” Dr. Anawalt said.
Also, the researchers obtained raw data from the nine IPD studies and reanalyzed the combined data, which allows for more sophisticated analysis when combining data from multiple studies, Dr. Anawalt explained.
The main finding from the Yeap et al. study, he wrote, is that high testosterone concentrations at baseline were not linked with increased deaths from CVD or from all causes “but very low serum total testosterone concentrations at baseline were.
“It is tempting to hypothesize that testosterone therapy might have cardiovascular benefits solely in patients with very low concentrations of serum total testosterone,” Dr. Anawalt wrote.
He pointed out as particularly interesting the findings for DHT and estradiol.
“The finding that a low serum estradiol concentration is associated with higher all-cause mortality adds another reason (in addition to the adverse effects on body fat and bone health) to avoid aromatase inhibitors that are commonly taken by persons who use anabolic steroids,” he wrote. “The prospect of a U-shaped curve for the relationship between serum DHT and higher cardiovascular risk warrants further study.”
The work is funded by the Government of Western Australia and Lawley Pharmaceuticals. The authors’ and editorial writer’s conflicts of interest are listed in the full study.
FROM ANNALS OF INTERNAL MEDICINE
Is Red Meat Healthy? Multiverse Analysis Has Lessons Beyond Meat
Observational studies on red meat consumption and lifespan are prime examples of attempts to find signal in a sea of noise.
Randomized controlled trials are the best way to sort cause from mere correlation. But these are not possible in most matters of food consumption. So, we look back and observe groups with different exposures.
My most frequent complaint about these nonrandom comparison studies has been the chance that the two groups differ in important ways, and it’s these differences — not the food in question — that account for the disparate outcomes.
But selection biases are only one issue. There is also the matter of analytic flexibility. Observational studies are born from large databases. Researchers have many choices in how to analyze all these data.
A few years ago, Brian Nosek, PhD, and colleagues elegantly showed that analytic choices can affect results. His Many Analysts, One Data Set study had little uptake in the medical community, perhaps because he studied a social science question.
Multiple Ways to Slice the Data
Recently, a group from McMaster University, led by Dena Zeraatkar, PhD, has confirmed the analytic choices problem, using the question of red meat consumption and mortality.
Their idea was simple: Because there are many plausible and defensible ways to analyze a dataset, we should not choose one method; rather, we should choose thousands, combine the results, and see where the truth lies.
You might wonder how there could be thousands of ways to analyze a dataset. I surely did.
The answer stems from the choices that researchers face. For instance, there is the selection of eligible participants, the choice of analytic model (logistic, Poisson, etc.), and covariates for which to adjust. Think exponents when combining possible choices.
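To see how quickly those choices compound, multiply the number of options at each decision point. The decision points and counts below are invented for illustration, not taken from the paper:

```python
from math import prod

# Hypothetical analytic decision points and the number of options at each.
choices = {
    "eligibility criteria": 5,    # e.g., age cutoffs, exclusions
    "model family": 3,            # Cox, Poisson, logistic
    "exposure definition": 4,     # servings/day thresholds
    "covariate adjustment": 2 ** 20,  # 20 candidate covariates, each in or out
}

total = prod(choices.values())
print(f"{total:,} unique specifications")  # → 62,914,560 unique specifications
```

Even this modest menu yields tens of millions of defensible analyses, which is why the full space in the paper runs into the quadrillions.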
Dr. Zeraatkar and colleagues are research methodologists, so, sadly, they are comfortable with the clunky name of this approach: specification curve analysis. Don’t be deterred. It means that they analyze the data in thousands of ways using computers. Each way is a specification. In the end, the specifications give rise to a curve of hazard ratios for red meat and mortality. Another name for this approach is multiverse analysis.
For their paper in the Journal of Clinical Epidemiology, aptly named “Grilling the Data,” they didn’t just conjure up the many analytic ways to study the red meat–mortality question. Instead, they used a published systematic review of 15 studies on unprocessed red meat and early mortality. The studies included in this review reported 70 unique ways to analyze the association.
Is Red Meat Good or Bad?
Their first finding was that this analysis yielded widely disparate effect estimates, from 0.63 (reduced risk for early death) to 2.31 (a higher risk). The median hazard ratio was 1.14 with an interquartile range (IQR) of 1.02-1.23. One might conclude from this that eating red meat is associated with a slightly higher risk for early mortality.
Their second step was to calculate how many ways (specifications) there were to analyze the data by totaling all possible combinations of choices in the 70 ways found in the systematic review.
They calculated a total of 10 quadrillion possible unique analyses. A quadrillion is 1 followed by 15 zeros. Computing power cannot yet handle that many analyses, so they generated 20 random unique combinations of covariates, which narrowed the number of analyses to about 1400. About 200 of these were excluded because of implausibly wide confidence intervals.
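A rough sketch of what randomly sampling covariate sets might look like; the covariate names and the 50/50 inclusion rule are assumptions for illustration, not the authors' procedure:

```python
import random

random.seed(0)
covariates = [f"cov_{i}" for i in range(20)]  # hypothetical covariate names

def sample_covariate_sets(candidates, n_sets):
    """Draw n_sets distinct random subsets of the candidate covariates."""
    seen = set()
    while len(seen) < n_sets:
        # Include each covariate independently with probability 0.5.
        subset = frozenset(c for c in candidates if random.random() < 0.5)
        seen.add(subset)
    return [sorted(s) for s in seen]

sets = sample_covariate_sets(covariates, 20)
# Crossing 20 covariate sets with the 70 reported modeling approaches
# yields 1400 candidate specifications, as in the paper.
print(len(sets) * 70)  # → 1400
```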
Voilà. They now had about 1200 different ways to analyze a dataset; they chose an NHANES longitudinal cohort study from 2007-2014. They deemed each of the more than 1200 approaches plausible because they were derived from peer-reviewed papers written by experts in epidemiology.
Specification Curve Analyses Results
Each analysis (or specification) yielded a hazard ratio for red meat exposure and death.
- The median HR was 0.94 (IQR, 0.83-1.05) for the effect of red meat on all-cause mortality — ie, not significant.
- The range of hazard ratios was large, from 0.51 (a 49% reduced risk for early mortality) to 1.75 (a 75% increase in early mortality).
- Among all analyses, 36% yielded hazard ratios above 1.0 and 64% yielded hazard ratios below 1.0.
- As for statistical significance, defined as P ≤.05, only 4% (or 48 specifications) met this threshold. Zeraatkar reminded me that this is what you’d expect if unprocessed red meat has no effect on longevity.
- Of the 48 analyses deemed statistically significant, 40 indicated that red meat consumption reduced early death and eight indicated that eating red meat led to higher mortality.
- Nearly half the analyses yielded unexciting point estimates, with hazard ratios between 0.90 and 1.10.
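The tallies in the bullets above are straightforward to compute once every specification's hazard ratio and P value are collected. The sketch below uses simulated toy values, not the study's estimates, just to show the bookkeeping:

```python
import random

random.seed(1)

# Toy specification-curve output: (hazard_ratio, p_value) per specification.
specs = [(random.lognormvariate(0, 0.1), random.random()) for _ in range(1200)]

hrs = sorted(hr for hr, _ in specs)
median_hr = hrs[len(hrs) // 2]
share_above = sum(hr > 1.0 for hr in hrs) / len(hrs)
share_signif = sum(p <= 0.05 for _, p in specs) / len(specs)
share_null_zone = sum(0.90 <= hr <= 1.10 for hr in hrs) / len(hrs)

print(f"median HR {median_hr:.2f}; {share_above:.0%} above 1.0; "
      f"{share_signif:.0%} significant; {share_null_zone:.0%} within 0.90-1.10")
```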
Paradigm Changing
As a user of evidence, I find this a potentially paradigm-changing study. Observational studies far outnumber randomized trials. For many medical questions, observational data are all we have.
Now think about every observational study published. The authors tell you — post hoc — which method they used to analyze the data. The key point is that it is one method.
Dr. Zeraatkar and colleagues have shown that there are thousands of plausible ways to analyze the data, and this can lead to very different findings. In the specific question of red meat and mortality, their many analyses yielded a null result.
Now imagine other cases where the researchers did many analyses of a dataset and chose to publish only the significant ones. Observational studies are rarely preregistered, so a reader cannot know how a result would vary depending on analytic choices. A specification curve analysis of a dataset provides a much broader picture. In the case of red meat, you see some significant results, but the vast majority hover around null.
What about the difficulty in analyzing a dataset 1000 different ways? Dr. Zeraatkar told me that it is harder than just choosing one method, but it’s not impossible.
The main barrier to adopting this multiverse approach to data, she noted, was not the extra work but the entrenched belief among researchers that there is a best way to analyze data.
I hope you read this paper and think about it every time you read an observational study that finds a positive or negative association between two things. Ask: What if the researchers were as careful as Dr. Zeraatkar and colleagues and did multiple different analyses? Would the finding hold up to a series of plausible analytic choices?
Nutritional epidemiology would benefit greatly from this approach. But so would any observational study of an exposure and outcome. I suspect that the number of “positive” associations would diminish. And that would not be a bad thing.
Dr. Mandrola, a clinical electrophysiologist at Baptist Medical Associates, Louisville, Kentucky, disclosed no relevant financial relationships.
A version of this article appeared on Medscape.com.
Managing Obesity Can Lead to Sarcopenia: A ‘Hidden’ Problem
ASUNCIÓN, PARAGUAY — Sarcopenic obesity, which is characterized by excess adiposity and muscle loss, is an “underestimated and underdiagnosed” condition, said the panelists at a session of the XV Latin American Obesity Congress (FLASO 2024) and II Paraguayan Congress of Obesity. The condition often affects older adults but can also occur at any age as a result of unhealthy habits or intensive or repeated weight loss efforts.
“The drugs currently used for managing obesity promote significant weight loss, but by losing fat, muscle is also lost,” said Fabiola Romero Gómez, MD, a professor of medicine at the National University of Asunción and president of the Paraguayan Society of Endocrinology and Metabolism. “We must handle [these drugs] with extreme care. When we employ a strategy that achieves this significant weight loss, we must ensure that the patient receives a good protein intake and engages in resistance exercises, because otherwise, the cure may be worse than the disease.”
Some patients develop sarcopenic obesity after using glucagon-like peptide-1 (GLP-1) analogs, undergoing bariatric surgery, or pursuing restrictive diets, Dr. Romero said in an interview. The condition is more common when there are long-standing cycles of weight loss and subsequent gain, “which accounts for the majority of our patients,” she said.
“An important, largely ignored aspect of weight loss, whether through pharmacological or lifestyle intervention, is that a portion of the weight loss comprises lean muscle,” according to a recent editorial in Nature Medicine. “Weight regain, however, is almost entirely fat. People with chronic obesity often lose and regain weight in repeated cycles, each of which results in body-composition changes (even if they experience some net weight loss). This cycling puts people unable to sustain weight loss at risk of being metabolically less healthy than they were before the initial weight loss was achieved — in effect, at risk of developing sarcopenic obesity.”
A ‘Hidden’ Problem
According to the 2022 consensus of the European Society for Clinical Nutrition and Metabolism and the European Association for the Study of Obesity, clinical signs or factors suggesting sarcopenic obesity include age over 70 years, diagnosis of a chronic disease, repeated falls or weakness, and nutritional events such as recent weight loss or rapid gain, long-standing restrictive diets, and bariatric surgery.
The European guidelines also propose screening in individuals at risk to check for an increased body mass index (BMI) or waist circumference and suspicion parameters of sarcopenia. In this group of patients, the diagnosis should be made based on the analysis of alterations in muscle-skeletal functional parameters, such as grip or pinch strength or the 30-second chair stand test, followed by a determination of body mass alteration using dual-energy x-ray absorptiometry or electrical bioimpedance.
Electrical bioimpedance is Dr. Romero’s preferred method. It is an economical, simple, and easily transportable test that calculates lean muscle mass, fat mass, and body water based on electrical conductivity, she said. Experts have pointed out that bioimpedance scales “will revolutionize the way we measure obesity,” she added.
In an as-yet-unpublished study that received an honorable mention at the 3rd Paraguayan Congress of Endocrinology, Diabetes, and Metabolism last year, Dr. Romero and colleagues studied 126 patients (median age, 45 years) with obesity defined by percentage of fat mass determined by bioimpedance. When their BMI was analyzed, 11.1% were “normal” weight, and 35.7% were “overweight.” Even waist circumference measurement suggested that about 15% of participants were without obesity. Moreover, almost one in four participants presented with sarcopenia, “implying a decrease in quality of life and physical disability in the future if not investigated, diagnosed, and treated correctly,” said Dr. Romero.
Prevention and Recommendations
Exercise and nutrition are two key components in the prevention and management of sarcopenic obesity. Physicians prescribing GLP-1 receptor agonists “must also counsel patients about incorporating aerobic exercise and resistance training as part of the treatment plan, as well as ensuring they eat a high-protein diet,” Yoon Ji Ahn, MD, and Vibha Singhal, MD, MPH, of the Weight Management Center of Massachusetts General Hospital in Boston, wrote in a commentary published by this news organization.
Paraguayan nutritionist Patricia López Soto, a diabetes educator with postgraduate degrees in obesity, diabetes, and bariatric surgery from Favaloro University in Buenos Aires, shared with this news organization the following general recommendations to prevent sarcopenic obesity in patients undergoing weight loss treatment:
- Follow a healthy and balanced Mediterranean or DASH-style diet.
- Increase protein intake at the three to four main meals to a minimum of 1.4-1.5 g/kg/day.
- Try to make the protein intake mostly of high biological value: Beef, chicken, fish, eggs, seafood, cheese, skim milk, and yogurt.
- Ensure a protein intake of 25-30 g at each meal to increase protein synthesis. For example, a 150 g portion of meat or chicken provides about 30 g of protein.
- If the protein target is not reached through food, a supplement such as isolated or hydrolyzed whey protein is a good option.
- Engage in strength or resistance training (weightlifting) three to four times per week and 30 minutes of cardiovascular exercise every day.
- To improve adherence, treatment should be carried out with a multidisciplinary team that includes a physician, nutritionist, and physical trainer, with frequent check-ups and body composition studies by bioimpedance.
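As a quick worked example of the intake targets above (the 80 kg body weight is hypothetical):

```python
def daily_protein_range(weight_kg, low=1.4, high=1.5):
    """Daily protein target (g) for the 1.4-1.5 g/kg/day recommendation."""
    return weight_kg * low, weight_kg * high

lo, hi = daily_protein_range(80)  # hypothetical 80 kg patient
print(f"{lo:.0f}-{hi:.0f} g/day, roughly {lo / 4:.0f}-{hi / 4:.0f} g "
      f"per meal across 4 meals")  # → 112-120 g/day, roughly 28-30 g per meal
```

Spreading that total across three to four meals lands in the 25-30 g per-meal range recommended above.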
Dr. Romero and Ms. López declared no relevant financial relationships.
This story was translated from the Medscape Spanish edition using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.
ASUNCIÓN, PARAGUAY — Sarcopenic obesity, which is characterized by excess adiposity and muscle loss, is an “underestimated and underdiagnosed” condition, said the panelists at a session of the XV Latin American Obesity Congress (FLASO 2024) and II Paraguayan Congress of Obesity. The condition often affects older adults but can also occur at any age as a result of unhealthy habits or intensive or repeated weight loss efforts.
“The drugs currently used for managing obesity promote significant weight loss, but by losing fat, muscle is also lost,” said Fabiola Romero Gómez, MD, a professor of medicine at the National University of Asunción and president of the Paraguayan Society of Endocrinology and Metabolism. “We must handle [these drugs] with extreme care. When we employ a strategy that achieves this significant weight loss, we must ensure that the patient receives a good protein intake and engages in resistance exercises, because otherwise, the cure may be worse than the disease.”
Some patients develop sarcopenic obesity after using glucagon-like peptide-1 (GLP-1) analogs, undergoing bariatric surgery, or pursuing restrictive diets, Dr. Romero said in an interview. The condition is more common when there are long-standing cycles of weight loss and subsequent gain, “which accounts for the majority of our patients,” she said.
“An important, largely ignored aspect of weight loss, whether through pharmacological or lifestyle intervention, is that a portion of the weight loss comprises lean muscle,” according to a recent editorial in Nature Medicine. “Weight regain, however, is almost entirely fat. People with chronic obesity often lose and regain weight in repeated cycles, each of which results in body-composition changes (even if they experience some net weight loss). This cycling puts people unable to sustain weight loss at risk of being metabolically less healthy than they were before the initial weight loss was achieved — in effect, at risk of developing sarcopenic obesity.”
A ‘Hidden’ Problem
According to the 2022 consensus of the European Society for Clinical Nutrition and Metabolism and the European Association for the Study of Obesity, clinical signs or factors suggesting sarcopenic obesity include age over 70 years, diagnosis of a chronic disease, repeated falls or weakness, and nutritional events such as recent weight loss or rapid gain, long-standing restrictive diets, and bariatric surgery.
The European guidelines also propose screening individuals at risk, checking for an increased body mass index (BMI) or waist circumference together with parameters suggestive of sarcopenia. In this group of patients, the diagnosis should be made by assessing alterations in skeletal muscle functional parameters, such as grip or pinch strength or the 30-second chair stand test, followed by confirmation of altered body composition using dual-energy x-ray absorptiometry or electrical bioimpedance.
Electrical bioimpedance is Dr. Romero’s preferred method. It is an economical, simple, and easily transportable test that calculates lean muscle mass, fat mass, and body water based on electrical conductivity, she said. Experts have pointed out that bioimpedance scales “will revolutionize the way we measure obesity,” she added.
In an as-yet-unpublished study that received an honorable mention at the 3rd Paraguayan Congress of Endocrinology, Diabetes, and Metabolism last year, Dr. Romero and colleagues studied 126 patients (median age, 45 years) with obesity defined by percentage of fat mass determined by bioimpedance. When their BMI was analyzed, 11.1% were “normal” weight, and 35.7% were “overweight.” Even waist circumference measurements suggested that about 15% of participants did not have obesity. Moreover, almost one in four participants presented with sarcopenia, “implying a decrease in quality of life and physical disability in the future if not investigated, diagnosed, and treated correctly,” said Dr. Romero.
Prevention and Recommendations
Exercise and nutrition are two key components in the prevention and management of sarcopenic obesity. Physicians prescribing GLP-1 receptor agonists “must also counsel patients about incorporating aerobic exercise and resistance training as part of the treatment plan, as well as ensuring they eat a high-protein diet,” Yoon Ji Ahn, MD, and Vibha Singhal, MD, MPH, of the Weight Management Center of Massachusetts General Hospital in Boston, wrote in a commentary published by this news organization.
Paraguayan nutritionist Patricia López Soto, a diabetes educator with postgraduate degrees in obesity, diabetes, and bariatric surgery from Favaloro University in Buenos Aires, shared with this news organization the following general recommendations to prevent sarcopenic obesity in patients undergoing weight loss treatment:
- Follow a healthy and balanced Mediterranean or DASH-style diet.
- Increase protein intake at the three to four main meals to a minimum of 1.4-1.5 g/kg/day.
- Try to make the protein intake mostly of high biological value: Beef, chicken, fish, eggs, seafood, cheese, skim milk, and yogurt.
- Ensure protein intake at each meal of between 25 g and 30 g to increase protein synthesis. For example, a 150 g portion of meat or chicken provides 30 g of protein.
- If the protein intake is not achieved through food, a supplement measure like isolated and hydrolyzed whey protein is a good option.
- Engage in strength or resistance training (weightlifting) three to four times per week and 30 minutes of cardiovascular exercise every day.
- To improve adherence, treatment should be carried out with a multidisciplinary team that includes a physician, nutritionist, and physical trainer, with frequent check-ups and body composition studies by bioimpedance.
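The protein arithmetic in the list above (a per-kilogram daily target divided across three to four main meals) can be sketched with a small helper; the function name and defaults below are illustrative, not part of Ms. López's recommendations:

```python
def protein_targets(weight_kg, g_per_kg=1.5, meals=4):
    """Split a daily protein target (g/kg/day) evenly across main meals."""
    daily_g = weight_kg * g_per_kg
    per_meal_g = daily_g / meals
    return daily_g, per_meal_g

# An 80 kg patient at 1.5 g/kg/day over four main meals:
daily, per_meal = protein_targets(80)
print(daily, per_meal)  # 120.0 g/day, 30.0 g per meal
```

Note that 30 g per meal sits at the top of the 25-30 g range cited above; a heavier patient who cannot reach the per-meal target through food alone would fall back on the whey supplement option.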
Dr. Romero and Ms. López declared no relevant financial relationships.
This story was translated from the Medscape Spanish edition using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.
Pancreatic Fat Is the Main Driver for Exocrine and Endocrine Pancreatic Diseases
TOPLINE:
Fatty change of the pancreas (FP) resulting from excessive intrapancreatic fat deposition (IPFD) was present in almost 18% of participants in a large population-based cohort, and both IPFD and FP were associated with an increased risk for diabetes, acute pancreatitis, and pancreatic cancer.
METHODOLOGY:
- This prospective cohort study conducted from July 2014 to January 2023 investigated the prevalence of FP and the link between IPFD and pancreatic diseases in 42,599 participants (median age, 65 years; 46.6% men) from the UK Biobank who underwent abdominal Dixon MRI.
- IPFD levels were measured using MRI and a deep learning-based framework called nnUNet.
- The outcomes assessed in this study were diseases of the exocrine pancreas and endocrine pancreas, including acute pancreatitis, pancreatic cancer, diabetes, and other pancreatic conditions.
TAKEAWAY:
- The prevalence of FP was 17.86%.
- Elevation in IPFD levels by one quintile increased the risk for the development of acute pancreatitis by 51.3% (P = .001), pancreatic cancer by 36.5% (P = .017), diabetes by 22.1% (P < .001), and all pancreatic diseases by 22.7% (P < .001).
- FP increased the risk for acute pancreatitis by 298.2% (P < .001), pancreatic cancer by 97.6% (P = .034), diabetes by 33.7% (P = .001), and all pancreatic diseases by 44.1% (P < .001).
- An increasing trend in the prevalence of FP with advancing age was observed in both men and women.
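The "% increased risk" figures above map directly to risk (hazard) ratios: an X% increase corresponds to a ratio of 1 + X/100. A quick sketch of that conversion (illustrative code, not from the study):

```python
def pct_increase_to_ratio(pct_increase):
    """Convert a '% increased risk' figure to a risk/hazard ratio."""
    return 1 + pct_increase / 100

# The 51.3% increase per IPFD quintile and the 298.2% increase with FP:
print(round(pct_increase_to_ratio(51.3), 3))   # 1.513
print(round(pct_increase_to_ratio(298.2), 3))  # 3.982
```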
IN PRACTICE:
“FP is a common pancreatic disorder. Fat in the pancreas is an independent risk factor for diseases of both the exocrine pancreas and endocrine pancreas,” the authors wrote.
SOURCE:
This study, led by Xiaowu Dong, MD, of the Pancreatic Center, Department of Gastroenterology, Yangzhou Key Laboratory of Pancreatic Disease, Affiliated Hospital of Yangzhou University, Yangzhou University, Yangzhou, China, was published online in The American Journal of Gastroenterology.
LIMITATIONS:
The authors acknowledged that most of the enrolled participants were White and older than 45 years. A low response rate to recruitment invitations in the UK Biobank database may have introduced self-selection bias. The median follow-up duration of 4.61 years was short and may be insufficient to fully capture the impact of IPFD. Additionally, the use of the average fat fraction for the entire pancreas may have led to spatial variations being ignored.
DISCLOSURES:
This work was supported by the National Natural Science Foundation of China, Cultivation Foundation of Yangzhou Municipal Key Laboratory, The Medical Research Project of Jiangsu Provincial Health Commission, Yangzhou key research and development plan, and Suzhou Innovation Platform Construction Projects-Municipal Key Laboratory Construction. The authors declared no conflicts of interest.
A version of this article appeared on Medscape.com.
Microbiome Alterations Linked to Growth Hormone Deficiency
Children with growth hormone deficiency (GHD) show distinct gut microbiome and metabolite profiles compared with children with idiopathic short stature (ISS), said Chinese researchers.
The research, published recently in Pediatric Research, involved more than 80 children and showed that those with GHD had alterations in microbial populations that have been linked to longevity, as well as a microbial and metabolite signature that allowed accurate discrimination from ISS.
“These findings provide novel insights into potential early diagnosis and innovative treatment alternatives, such as fecal microbiota transplantation, for short stature with varying growth hormone levels,” the authors wrote.
Andrew Dauber, MD, MMSc, chief of endocrinology, Children’s National Hospital, Washington, who was not involved in the study, said that while this is “a really interesting area of research,” he expressed “hesitancy about getting too excited about this data yet.”
“One of the problems is how you define growth hormone deficiency,” as it is “not a black and white diagnosis,” and the etiology and child’s growth trajectory also need to be considered, Dr. Dauber said.
He explained: “The problem is that, when you rely on the growth hormone stimulation test alone, there’s so many false positives and so much overlap between patients with true growth hormone deficiency and those without. And I think that this article fell prey to that.”
He added: “It would be really, really interesting and helpful to have a microbiome signature that allows you to distinguish between true growth hormone deficiency and patients with idiopathic short stature.”
“But you have to make sure that your groups are very well defined for this study to be really valid. And that’s one of my concerns here.”
Dr. Dauber continued: “Now, that being said, they did find some associations that correlated with growth hormone peak levels,” some of which replicate previous findings, “so I do think that there are kernels of important findings here.”
‘Tease Out Influences’ to Isolate the Interaction
He pointed out that there are “many factors that influence the microbiome,” such as the use of antibiotics, diet, age, and geographic location. Therefore, a study that could truly tease out all these influences and isolate the interaction with growth hormone levels would need to be “very thoughtfully designed.”
A number of factors contribute to short stature, wrote lead author Lan Li, MD, Department of Radiology, The Second Affiliated Hospital and Yuying Children’s Hospital of Wenzhou Medical University, Wenzhou, China, and colleagues.
These include genetic factors, environmental factors, and conditions such as being small for gestational age at birth, familial short stature, and chronic systemic diseases, as well as GHD and ISS.
Recent animal studies have suggested that there may be a bidirectional relationship between the gut microbiota and the growth hormone/insulin-like growth factor 1 axis, and it has been shown that individuals with GHD have significant alterations in their gut microbiota compared with healthy controls.
To investigate, they studied 36 children diagnosed with GHD, 32 with ISS, and 16 age- and sex-matched healthy controls, all of whom were recruited between February 2019 and June 2021 from the Pediatric Endocrinology Department of The Second Affiliated Hospital of Wenzhou Medical University.
Fecal samples obtained from the children underwent microbiome analysis using 16S ribosomal RNA gene sequencing, alongside nuclear magnetic resonance (NMR) analysis of the metabolome, or the entire complement of small molecules in the samples.
Patients with GHD had a significantly higher body mass index than those with ISS (P < .05), and their peak growth hormone level was significantly lower (P < .001). Patients with GHD also had significantly higher total cholesterol and low-density lipoprotein cholesterol levels than patients with ISS (P < .05).
The team reports that the alpha diversity of the fecal microbiome, which measures the microbial diversity within a fecal sample, was similar between the three groups.
However, there was significant variation between the groups in the beta diversity, which quantifies the similarity or dissimilarity between two samples, and allows the overall taxonomic or functional diversity pattern to be linked to environmental features.
Compared with the healthy control group, the abundance of Pelomonas, Rodentibacter, and Rothia was significantly decreased in patients with GHD and ISS, while the abundance of Prevotellaceae_NK3B31_group was increased in the two patient groups, particularly in those with GHD.
In addition, the researchers found a decreased Firmicutes/Bacteroidota (F/B) ratio in participants with short stature, particularly in the GHD group. They noted that “emerging evidence suggests the F/B ratio may play a role in longevity.”
Nocardioides was substantially more common in the ISS group vs both patients with GHD and healthy controls, while Fusobacterium mortiferum was characteristic of GHD. The team suggests this “may serve as a critical intestinal factor contributing to the short stature observed in GHD.”
The metabolome analysis revealed that glucose, pyruvate, and pyrimidine metabolism may also play a significant role in distinguishing between patients with GHD and ISS and healthy control groups.
Finally, the team demonstrated that a panel combining 13 microbiome and metabolome markers was able to discriminate between GHD and ISS at an area under the receiver operating characteristic curve of 0.945, with a sensitivity of 87% and a specificity of 91%.
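The sensitivity and specificity reported for the 13-marker panel are simple ratios over a 2x2 confusion matrix; the sketch below uses hypothetical counts chosen to reproduce the quoted 87% and 91% (not the study's actual data):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical counts matching the reported panel performance:
sens, spec = sens_spec(tp=87, fn=13, tn=91, fp=9)
print(sens, spec)  # 0.87 0.91
```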
The study was supported by grants from the National Natural Science Foundation of China and Wenzhou Science and Technology Bureau in China. No relevant financial relationships were declared.
A version of this article appeared on Medscape.com.
, said Chinese researchers.
The research, published recently in Pediatric Research, involved more than 80 children and showed that those with GHD had alterations in microbial populations that have been linked to longevity, as well as a microbial and metabolite signature that allowed accurate discrimination from ISS.
“These findings provide novel insights into potential early diagnosis and innovative treatment alternatives, such as fecal microbiota transplantation, for short stature with varying growth hormone levels,” the authors wrote.
Andrew Dauber, MD, MMSc, chief of endocrinology, Children’s National Hospital, Washington, who was not involved in the study, said that while this is “a really interesting area of research,” he expressed “hesitancy about getting too excited about this data yet.”
“One of the problems is how you define growth hormone deficiency,” as it is “not a black and white diagnosis,” and the etiology and child’s growth trajectory also need to be considered, Dr. Dauber said.
He explained: “The problem is that, when you rely on the growth hormone stimulation test alone, there’s so many false positives and so much overlap between patients with true growth hormone deficiency and those without. And I think that this article fell prey to that.”
He added: “It would be really, really interesting and helpful to have a microbiome signature that allows you to distinguish between true growth hormone deficiency and patients with idiopathic short stature.”
“But you have to make sure that your groups are very well defined for this study to be really valid. And that’s one of my concerns here.”
Dr. Dauber continued: “Now, that being said, they did find some associations that correlated with growth hormone peak levels,” some of which replicate previous findings, “so I do think that there are kernels of important findings here.”
‘Tease Out Influences’ to Isolate the Interaction
He pointed out that there are “many factors that influence the microbiome,” such as the use of antibiotics, diet, age, and geographic location. Therefore, a study that could truly tease out all these influences and isolate the interaction with growth hormone levels would need to be “very thoughtfully designed.”
A number of factors contribute to short stature, wrote lead author Lan Li, MD, Department of Radiology, The Second Affiliated Hospital and Yuying Children’s Hospital of Wenzhou Medical University, Wenzhou, China, and colleagues.
These include genetic factors, environmental factors, and conditions such as being small for gestational age at birth, familial short stature, and chronic systemic diseases, as well as GHD and ISS.
Recent animal studies have suggested that there may be a bidirectional relationship between the gut microbiota and the growth hormone/insulin-like growth factor 1 axis, and it has been shown that individuals with GHD have significant alterations in their gut microbiota compared with healthy controls.
To investigate, they studied 36 children diagnosed with GHD, 32 with ISS, and 16 age- and sex-matched healthy controls, all of whom were recruited between February 2019 and June 2021 from the Pediatric Endocrinology Department of The Second Affiliated Hospital of Wenzhou Medical University.
Fecal samples obtained from the children underwent microbiome analysis using 16S ribosomal RNA gene sequencing, alongside nuclear magnetic resonance (NMR) analysis of the metabolome, or the entire complement of small molecules in the samples.
Patients with GHD had a significantly higher body mass index than those with ISS (P < .05), and their peak growth hormone level was significantly lower (P < .001). Patients with GHD also had significantly higher total cholesterol and low-density lipoprotein cholesterol levels than patients with ISS (P < .05).
The team reports that the alpha diversity of the fecal microbiome, which measures the microbial diversity within a fecal sample, was similar among the three groups.
However, there was significant variation between the groups in the beta diversity, which quantifies the similarity or dissimilarity between two samples, and allows the overall taxonomic or functional diversity pattern to be linked to environmental features.
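The two diversity measures described above can be made concrete with a small sketch. This is illustrative only, not the study's pipeline: it computes a common alpha-diversity metric (the Shannon index) for a single sample and a common beta-diversity metric (Bray-Curtis dissimilarity) between two samples, using hypothetical taxon-count vectors.

```python
# Illustrative sketch, not the study's analysis: Shannon alpha diversity
# within one sample, and Bray-Curtis dissimilarity between two samples,
# computed from made-up taxon-count vectors.
import math

def shannon_alpha(counts):
    """Shannon diversity index H' = -sum(p_i * ln p_i) within one sample."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two samples (0 = identical, 1 = no overlap)."""
    num = sum(abs(x - y) for x, y in zip(a, b))
    den = sum(a) + sum(b)
    return num / den

sample_1 = [30, 25, 20, 15, 10]   # hypothetical taxon counts, sample 1
sample_2 = [5, 10, 20, 30, 35]    # hypothetical taxon counts, sample 2

print(shannon_alpha(sample_1))            # within-sample (alpha) diversity
print(bray_curtis(sample_1, sample_2))    # between-sample (beta) dissimilarity
```

Two groups can have similar alpha diversity (each sample is internally diverse) while still differing in beta diversity, which is the pattern the study reports.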
Compared with the healthy control group, the abundance of Pelomonas, Rodentibacter, and Rothia was significantly decreased in patients with GHD and ISS, while the abundance of Prevotellaceae_NK3B31_group was increased in the two patient groups, particularly in those with GHD.
In addition, the researchers found a decreased Firmicutes/Bacteroidota (F/B) ratio in participants with short stature, particularly in the GHD group. They noted that “emerging evidence suggests the F/B ratio may play a role in longevity.”
Nocardioides was substantially more common in the ISS group than in both the GHD group and healthy controls, while Fusobacterium mortiferum was characteristic of GHD. The team suggests this “may serve as a critical intestinal factor contributing to the short stature observed in GHD.”
The metabolome analysis revealed that glucose, pyruvate, and pyrimidine metabolism may also play a significant role in distinguishing between patients with GHD and ISS and healthy control groups.
Finally, the team demonstrated that a panel combining 13 microbiome and metabolome markers was able to discriminate between GHD and ISS at an area under the receiver operating characteristic curve of 0.945, with a sensitivity of 87% and a specificity of 91%.
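To make the reported discrimination metrics concrete, here is a hedged sketch, using made-up panel scores rather than the study's 13-marker data, of how an area under the ROC curve (via the pairwise Mann-Whitney interpretation) and sensitivity/specificity at a cutoff are computed.

```python
# Hypothetical illustration of the reported metrics: AUC of a marker-panel
# score plus sensitivity/specificity at a chosen cutoff, on made-up scores
# (the study's 13-marker panel and patient data are not reproduced here).

def auc(pos_scores, neg_scores):
    """AUC via pairwise comparison: fraction of (pos, neg) pairs ranked correctly."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5   # ties count half
    return wins / (len(pos_scores) * len(neg_scores))

def sens_spec(pos_scores, neg_scores, cutoff):
    """Sensitivity and specificity when score >= cutoff calls a case positive."""
    sens = sum(s >= cutoff for s in pos_scores) / len(pos_scores)
    spec = sum(s < cutoff for s in neg_scores) / len(neg_scores)
    return sens, spec

ghd = [0.9, 0.8, 0.75, 0.6, 0.55]   # hypothetical panel scores, GHD group
iss = [0.5, 0.4, 0.35, 0.3, 0.58]   # hypothetical panel scores, ISS group

print(auc(ghd, iss))                # prints 0.96
print(sens_spec(ghd, iss, 0.55))    # prints (1.0, 0.8)
```

An AUC of 0.945, as reported, means a randomly chosen GHD sample outranks a randomly chosen ISS sample about 94.5% of the time; the 87%/91% sensitivity/specificity pair describes one operating point on that curve.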
The study was supported by grants from the National Natural Science Foundation of China and Wenzhou Science and Technology Bureau in China. No relevant financial relationships were declared.
A version of this article appeared on Medscape.com.
FROM PEDIATRIC RESEARCH
GLP-1 Receptor Agonists: Which Drug for Which Patient?
With all the excitement about GLP-1 agonists, how do we choose the right drug for the right patient?
Of course, we want to make sure that we’re treating the right condition. If the patient has type 2 diabetes, we tend to give them medication that is indicated for type 2 diabetes. Many GLP-1 agonists are available in a diabetes version and a chronic weight management or obesity version. If a patient has diabetes and obesity, they can receive either one. If a patient has only diabetes but not obesity, they should be prescribed the diabetes version. For obesity without diabetes, we tend to stick with the drugs that are indicated for chronic weight management.
Let’s go through them.
Exenatide. In chronological order of approval, the first GLP-1 drug that was used for diabetes dates back to exenatide (Bydureon). Bydureon had a partner called Byetta (also exenatide), both of which are still on the market but infrequently used. Some patients reported that these medications were inconvenient because they required twice-daily injections and caused painful injection-site nodules.
Diabetes drugs in more common use include liraglutide (Victoza) for type 2 diabetes. It is a daily injection and has various doses. We always start low and increase with tolerance and desired effect for A1c.
Liraglutide. Victoza has an antiobesity counterpart called Saxenda. The Saxenda pen looks very similar to the Victoza pen. It is a daily GLP-1 agonist for chronic weight management. The SCALE trial demonstrated 8%-12% weight loss with Saxenda.
Those are the daily injections: Victoza for diabetes and Saxenda for weight loss.
Our patients are very excited about the advent of weekly injections for diabetes and weight management. Ozempic is very popular. It is a weekly GLP-1 agonist for type 2 diabetes. Many patients come in asking for Ozempic, and we must make sure that we’re moving them in the right direction depending on their condition.
Semaglutide. Ozempic has a few different doses. It is a weekly injection and has been found to be quite efficacious for treating diabetes. The drug’s weight loss counterpart is called Wegovy, which comes in a different pen. Both forms contain the compound semaglutide. While all of these GLP-1 agonists are indicated to treat type 2 diabetes or for weight management, Wegovy has a special indication that none of the others have. In March 2024, Wegovy received an indication to reduce cardiovascular risk in patients with a BMI ≥ 27 and established cardiovascular disease. This will really change the accessibility of this medication because patients with heart conditions who are on Medicare are expected to have access to Wegovy.
Tirzepatide. Another weekly injection for treatment of type 2 diabetes is called Mounjaro. Its counterpart for weight management is called Zepbound, which was found to produce about 20.9% weight loss over 72 weeks. These medications have similar side effects in differing degrees, but the most often reported are nausea, stool changes, abdominal pain, and reflux. There are some other potential side effects; I recommend that you read the individual prescribing information available for each drug to have more clarity about that.
It is important that we stay on label for using the GLP-1 receptor agonists, for many reasons. One, it increases our patients’ accessibility to the right medication for them, and we can also make sure that we’re treating the patient with the right drug according to the clinical trials. When the clinical trials are done, the study populations demonstrate safety and efficacy for that population. But if we’re prescribing a GLP-1 for a different population, it is considered off-label use.
Dr. Lofton, an obesity medicine specialist, is clinical associate professor of surgery and medicine at NYU Grossman School of Medicine, and director of the medical weight management program at NYU Langone Weight Management Center, New York. She disclosed ties to Novo Nordisk and Eli Lilly. This transcript has been edited for clarity.
A version of this article appeared on Medscape.com.
Are Carbs Really the Enemy?
Recent headlines scream that we have an obesity problem and that carbs are the culprit for the problem. That leads me to ask: How did we get to blaming carbs as the enemy in the war against obesity?
First, a quick review of the history of diet and macronutrient content.
A long time ago, prehistoric humans foraged and hunted for food. Protein and fat were procured from animal meat, which was very important for encephalization, or evolutionary increase in the complexity or relative size of the brain. Most of the requirements for protein and iron were satisfied by hunting and eating land animals as well as consuming marine life that washed up on shore.
Carbohydrates in the form of plant foods served as the only sources of energy available to prehistoric hunter-gatherers, which offset the high protein content of the rest of their diet. These were only available during spring and summer.
Then, about 10,000 years ago, plant and animal agriculture began, and humans saw a permanent shift in the macronutrient content of our daily intake so that it was more consistent and stable. Initially, the nutrient characteristic changes were subtle, going from wild food to cultivated food with the Agricultural Revolution in the mid-17th century. Then, it changed even more rapidly less than 200 years ago with the Industrial Revolution, resulting in semiprocessed and ultraprocessed foods.
This change in food intake altered human physiology, with major changes in our digestive, immune, and neural physiology and an increase in chronic disease prevalence. The last 50 years has seen an increase in obesity in the United States, along with increases in chronic disease such as type 2 diabetes, which leads to cardiovascular disease and certain cancers.
Back to Carbohydrates: Do We Need Them? How Much? What Kind?
Unfortunately, ultraprocessed foods have become a staple of the standard American or Western diet.
Ultraprocessed foods such as cakes, cookies, crackers, sugary breakfast cereals, pizza, potato chips, soft drinks, and ice cream are eons away from our prehistoric diet of wild game, nuts, fruits, and berries, during which our digestive, immune, and nervous systems evolved. The pace at which ultraprocessed foods have entered our diet outpaces the time necessary for adaptation of our digestive systems and genes to these foods. They are indeed pathogenic in this context.
So when did humans consume an “optimal” diet? This is hard to say because during the time of brain evolution, we needed protein and iron and succumbed to infections and trauma. In the early 1900s, we continued to succumb to infection until the discovery of antibiotics. Soon thereafter, industrialization and processed foods led to weight gain and the chronic diseases of the cardiovascular system and type 2 diabetes.
Carbohydrates provide calories and fiber and some micronutrients, which are needed for energy, metabolism, and bowel and immune health. But how much do we need?
Currently in the United States, the percentage of total food energy derived from the three major macronutrients is: carbohydrates, 51.8%; fat, 32.8%; and protein, 15.4%. Current advice for a healthy diet to lower risk for cardiovascular disease is to limit fat intake to 30% of total energy, protein to 15%, and to increase complex carbohydrates to 55%-60% of total energy. But we also need to qualify this in terms of the quality of the macronutrient, particularly carbohydrates.
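The percent-of-energy targets quoted above translate into grams per day once a total energy intake is fixed. The sketch below works through that arithmetic for an assumed 2,000-kcal diet (an illustrative figure, not one from this article), using the standard 4 kcal/g for carbohydrate and protein and 9 kcal/g for fat.

```python
# Worked example: converting percent-of-energy recommendations into
# grams/day. The 2,000-kcal total is an illustrative assumption.
KCAL_PER_GRAM = {"carbohydrate": 4, "protein": 4, "fat": 9}

def grams_per_day(total_kcal, pct_energy, macronutrient):
    """Grams/day of a macronutrient that supplies pct_energy % of total_kcal."""
    return total_kcal * (pct_energy / 100) / KCAL_PER_GRAM[macronutrient]

print(grams_per_day(2000, 55, "carbohydrate"))  # 275.0 g
print(grams_per_day(2000, 30, "fat"))           # about 66.7 g
print(grams_per_day(2000, 15, "protein"))       # 75.0 g
```

Because fat is more than twice as energy dense as carbohydrate, a 30%-of-energy fat allowance corresponds to far fewer grams than a 55%-of-energy carbohydrate allowance.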
In addition to the quality, the macronutrient content of the diet has varied considerably from our prehistoric times when dietary protein intakes were high at 19%-35% of energy at the expense of carbohydrate (22%-40% of energy).
If our genes haven’t kept up with industrialization, then why do we need carbohydrates to supply as much as 55%-60% of energy? Is it possible that we are confusing what is available with what we actually need? What do I mean by this?
We certainly have changed the landscape of the world due to agriculture, which has allowed us to procreate and feed ourselves, and certainly, industrialization has increased the availability of accessible cheap food. Protein in the form of meat, fish, and fowl is harder to get in industrialized nations, as are fruits and vegetables. These were the foods of our ancestors. It may be that a healthy diet is considered the one that is available.
For instance, the Mediterranean diet is somewhat higher in fat content, 40%-50% fat (mostly monounsaturated and polyunsaturated), and similar in protein content but lower in carbohydrate content than the typical Western diet. The Dietary Approaches to Stop Hypertension (DASH) diet is lower in fat at 25% of total calories, is higher in carbohydrates at 55%, and is lower in protein, but this diet was generated in the United States, therefore it is more Western.
We need high-quality protein for organ and muscle function, high-quality unsaturated and monounsaturated fats for brain function and cellular functions, and high-quality complex carbohydrates for energy and gut health as well as micronutrients for many cellular functions. A ketogenic diet is not sustainable in the long term for these reasons: chiefly the need for some carbohydrates for gut health and micronutrients.
How much carbohydrate content is needed should take into consideration energy expenditure as well as micronutrients and fiber intake. Protein and fat can contribute to energy production but not as readily as carbohydrates that can quickly restore glycogen in the muscle and liver. What’s interesting is that our ancestors were able to hunt and run away from danger with the small amounts of carbohydrates from plants and berries plus the protein and fat intake from animals and fish — but the Olympics weren’t a thing then!
It may be another 200,000 years before our genes catch up to ultraprocessed foods and the simple carbohydrates and sugars contained in these products. Evidence suggests that ultraprocessed foods cause inflammation in organs like the liver, adipose tissue, the heart, and even the brain. In the brain, this inflammation may be what’s causing us to defend a higher body weight set point in this environment of easily obtained highly palatable ultraprocessed foods.
Let’s not wait until our genes catch up and our bodies tolerate junk food without disease progression. It could be like waiting for Godot!
Dr. Apovian is professor of medicine, Harvard Medical School, and codirector, Center for Weight Management and Wellness, Brigham and Women’s Hospital, Boston, Massachusetts. She disclosed ties to Altimmune, CinFina Pharma, Cowen and Company, EPG Communication Holdings, Form Health, Gelesis, and L-Nutra.
A version of this article appeared on Medscape.com.
Recent headlines scream that we have an obesity problem and that carbs are the culprit for the problem. That leads me to ask: How did we get to blaming carbs as the enemy in the war against obesity?
First, a quick review of the history of diet and macronutrient content.
A long time ago, prehistoric humans foraged and hunted for food. Protein and fat were procured from animal meat, which was very important for encephalization, or evolutionary increase in the complexity or relative size of the brain. Most of the requirements for protein and iron were satisfied by hunting and eating land animals as well as consuming marine life that washed up on shore.
Carbohydrates in the form of plant foods served as the only sources of energy available to prehistoric hunter-gatherers, which offset the high protein content of the rest of their diet. These were only available during spring and summer.
Then, about 10,000 years ago, plant and animal agriculture began, and humans saw a permanent shift in the macronutrient content of our daily intake so that it was more consistent and stable. Initially, the nutrient characteristic changes were subtle, going from wild food to cultivated food with the Agricultural Revolution in the mid-17th century. Then, it changed even more rapidly less than 200 years ago with the Industrial Revolution, resulting in semiprocessed and ultraprocessed foods.
This change in food intake altered human physiology, with major changes in our digestive, immune, and neural physiology and an increase in chronic disease prevalence. The last 50 years have seen an increase in obesity in the United States, along with increases in chronic diseases such as type 2 diabetes, which leads to cardiovascular disease and certain cancers.
Back to Carbohydrates: Do We Need Them? How Much? What Kind?
Unfortunately, ultraprocessed foods have become a staple of the standard American or Western diet.
Ultraprocessed foods such as cakes, cookies, crackers, sugary breakfast cereals, pizza, potato chips, soft drinks, and ice cream are eons away from our prehistoric diet of wild game, nuts, fruits, and berries, during which time our digestive, immune, and nervous systems evolved. The pace at which ultraprocessed foods have entered our diet outpaces the time necessary for our digestive systems and genes to adapt to them. They are indeed pathogenic in this context.
So when did humans consume an “optimal” diet? This is hard to say, because during the period of brain evolution we needed protein and iron and succumbed to infections and trauma. In the early 1900s, we continued to succumb to infection until the discovery of antibiotics. Soon thereafter, industrialization and processed foods led to weight gain and the chronic diseases of the cardiovascular system and type 2 diabetes.
Carbohydrates provide calories and fiber and some micronutrients, which are needed for energy, metabolism, and bowel and immune health. But how much do we need?
Currently in the United States, the percentage of total food energy derived from the three major macronutrients is: carbohydrates, 51.8%; fat, 32.8%; and protein, 15.4%. Current advice for a healthy diet to lower risk for cardiovascular disease is to limit fat intake to 30% of total energy, protein to 15%, and to increase complex carbohydrates to 55%-60% of total energy. But we also need to qualify this in terms of the quality of the macronutrient, particularly carbohydrates.
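The arithmetic behind these energy shares is easy to verify. As a minimal sketch (the gram figures below are hypothetical, chosen only to approximate the cited U.S. averages), the standard Atwater factors of 4 kcal/g for carbohydrate and protein and 9 kcal/g for fat convert daily grams eaten into shares of total energy:

```python
# Atwater general factors: kcal per gram of each macronutrient
ATWATER = {"carbohydrate": 4, "protein": 4, "fat": 9}

def energy_split(grams):
    """Return each macronutrient's share of total energy, as a fraction."""
    kcal = {m: g * ATWATER[m] for m, g in grams.items()}
    total = sum(kcal.values())
    return {m: k / total for m, k in kcal.items()}

# Hypothetical daily intake, roughly matching the U.S. averages cited above
intake_g = {"carbohydrate": 300, "fat": 85, "protein": 90}
shares = energy_split(intake_g)
```

With this intake, carbohydrate works out to roughly 52% of energy, fat to about 33%, and protein to about 15%, close to the cited split.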
In addition to the quality, the macronutrient content of the diet has varied considerably since prehistoric times, when dietary protein intakes were high at 19%-35% of energy at the expense of carbohydrate (22%-40% of energy).
If our genes haven’t kept up with industrialization, then why do we need so many carbohydrates to equate to 55%-60% of energy? Is it possible that we are confusing what is available with what we actually need? What do I mean by this?
We certainly have changed the landscape of the world through agriculture, which has allowed us to procreate and feed ourselves, and industrialization has increased the availability of cheap, accessible food. Protein in the form of meat, fish, and fowl is harder to get in industrialized nations, as are fruits and vegetables. These were the foods of our ancestors. It may be that a “healthy” diet is simply considered to be the one that is available.
For instance, the Mediterranean diet is somewhat higher in fat content, at 40%-50% fat (mostly monounsaturated and polyunsaturated), similar in protein content, and lower in carbohydrate content than the typical Western diet. The Dietary Approaches to Stop Hypertension (DASH) diet is lower in fat at 25% of total calories, higher in carbohydrates at 55%, and lower in protein; because it was developed in the United States, it is more Western in character.
We need high-quality protein for organ and muscle function, high-quality unsaturated and monounsaturated fats for brain and cellular function, and high-quality complex carbohydrates for energy and gut health, as well as micronutrients for many cellular functions. A ketogenic diet is not sustainable in the long term for these reasons, chiefly the need for some carbohydrate for gut health and micronutrients.
How much carbohydrate content is needed should take into consideration energy expenditure as well as micronutrients and fiber intake. Protein and fat can contribute to energy production but not as readily as carbohydrates that can quickly restore glycogen in the muscle and liver. What’s interesting is that our ancestors were able to hunt and run away from danger with the small amounts of carbohydrates from plants and berries plus the protein and fat intake from animals and fish — but the Olympics weren’t a thing then!
It may be another 200,000 years before our genes catch up to ultraprocessed foods and the simple carbohydrates and sugars contained in these products. Evidence suggests that ultraprocessed foods cause inflammation in organs like the liver, adipose tissue, the heart, and even the brain. In the brain, this inflammation may be what’s causing us to defend a higher body weight set point in this environment of easily obtained highly palatable ultraprocessed foods.
Let’s not wait until our genes catch up and our bodies tolerate junk food without disease progression. It could be like waiting for Godot!
Dr. Apovian is professor of medicine, Harvard Medical School, and codirector, Center for Weight Management and Wellness, Brigham and Women’s Hospital, Boston, Massachusetts. She disclosed ties to Altimmune, CinFina Pharma, Cowen and Company, EPG Communication Holdings, Form Health, Gelesis, and L-Nutra.
A version of this article appeared on Medscape.com.
Temporary Gut Liner Lowers Weight, A1c
LONDON — A temporary gut liner, the EndoBarrier, led to substantial weight loss and improvement in A1c in patients with poorly controlled type 2 diabetes and obesity, data showed.
Two years after the liner’s removal, 80% of patients continued to show significant improvement, while 20% returned to baseline.
Presenting results at the Diabetes UK Professional Conference (DUKPC) 2024, the researchers, led by Bob Ryder, MD, FRCP, from the Department of Diabetes, Birmingham City Hospital, Birmingham, England, aimed to assess the safety and efficacy of EndoBarrier, as well as maintenance of efficacy 24 months after the device removal.
“We think EndoBarrier finds its place between the end of all the earlier measures and the possible option of bariatric surgery, and these data show that it can lead to tremendous weight loss and improvement in A1c,” Dr. Ryder said in an interview.
Commenting on how most patients had responded to use of the device, Dr. Ryder said, “People with obesity are often very unhappy and have tried everything over many years to no effect; however, this gut liner provided the opportunity to shift out of this state, and they often become so happy with the result they were determined to stick with it and continue with a healthier lifestyle including much more exercise.”
Convenient, Reversible Procedure
Ninety consecutive patients from Birmingham, all with longstanding, poorly controlled, type 2 diabetes and obesity, underwent the implantation procedure, and 60 of these attended follow-up visits 2 years post implantation.
Unlike permanent and more invasive weight loss surgeries, the EndoBarrier device is reversible and fitted with a straightforward procedure.
The thin, impermeable sleeve is inserted via an approximately 1-hour endoscopy, enabling the patient to return home the same day. It lines the first 60 cm of the small intestine. Digested food passes through it without absorption and then makes contact with pancreatic and bile juices at the other end. This triggers a change in the metabolism of glucose and nutrients by modulating gut hormones and gut bacteria, as well as disrupting bile flow.
“Because the food bypasses the small intestine, the first time the food is encountered is in an area where it is not normally found, and this causes a reaction where signals are sent to the brain to stop eating,” explained Dr. Ryder.
Because the device is licensed for 1 year of use, the gut liner was removed after a year via a 30-minute endoscopy procedure.
Over Half Maintained Full Improvement 2 Years Post Removal
A total of 60 of 90 patients (66%) attended follow-up visits, and their data comprised the results presented. Mean age was 51.2 years, 47% were men, 50% were White, mean body mass index (BMI) was 41.5 kg/m2, and mean A1c was 9.3%. Median duration of type 2 diabetes was 11 years, and 60% were taking insulin.
Patients followed dietary requirements for the initial phase after implantation. “During the first week, they followed a liquid diet, then during week 2 — mushy food, and then they were told to chew it really well to avoid blockage,” said Dr. Ryder.
Mean weight loss on removal of the liner (at 12 months post implantation) was 16.7 kg (P < .001), while BMI dropped by a mean of 6 kg/m2, A1c dropped by a mean of 1.8%, and mean systolic blood pressure fell by 10.9 mm Hg.
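Those figures are internally consistent: because BMI is weight divided by height squared, a 16.7 kg loss translates into a BMI drop of about 6 kg/m2 at a mean height of roughly 1.67 m. A quick sketch (the height and baseline weight below are hypothetical, back-calculated from the reported summary statistics for illustration only):

```python
def bmi(weight_kg, height_m):
    """Body mass index in kg/m2."""
    return weight_kg / height_m ** 2

# Hypothetical mean patient, back-calculated from the reported summary stats
height = 1.67            # m; chosen so a 16.7 kg loss gives a ~6 kg/m2 BMI drop
weight_before = 116.7    # kg; gives a baseline BMI near the reported 41.5
weight_after = weight_before - 16.7

bmi_drop = bmi(weight_before, height) - bmi(weight_after, height)
```

Here `bmi_drop` comes out just under 6 kg/m2, consistent with the reported mean reductions.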
Just over half (32/60, 53%) maintained fully sustained improvement 2 years after removal of the liner, defined as no significant difference between values at removal and at 2 years for both weight (mean, 96-97 kg) and A1c (7.6%-7.4%).
Sixteen of 60 (27%) showed partially sustained improvement over the 2 years of follow-up, with weight increasing from a mean of 116.8 kg to 128.6 kg and A1c increasing from 7.5% to 8.4%, while 20% (12/60) returned to baseline.
Of the 36 of 60 people using insulin prior to EndoBarrier treatment, 10 (27.8%) were no longer using insulin at 2 years post removal.
Thirteen of 90 (14%) had early removal of the gut liner due to gastrointestinal hemorrhage (five), liver abscess (two), other abscess (one), and gastrointestinal symptoms (five), but they all made a full recovery; after removal, most experienced benefit despite the adverse event, reported Dr. Ryder.
Sarah Davies, MBBCh, a GP at Woodlands Medical Centre, Cardiff, Wales, agreed that EndoBarrier might be a viable option for patients struggling with obesity. “As GPs, we are the first port of call for these patients. It’s very novel, I hadn’t heard of it before. I like how it’s a noninvasive way for my patients to lose weight and maintain that even after EndoBarrier has been removed.”
Outcomes are being monitored in an ongoing global registry to help determine if EndoBarrier is a safe and effective treatment for individuals with type 2 diabetes and obesity. Dr. Ryder noted that a similar study with 3 years of follow-up showed similar results. Further results will be presented by Dr. Ryder at the upcoming meeting of the American Diabetes Association.
EndoBarrier is currently not approved in the United States. It is awaiting a United Kingdom and European CE mark, which the manufacturer hopes will be granted this summer. The license will be for patients with a BMI of 35-50 kg/m2.
A version of this article appeared on Medscape.com.