Chronic Pain Linked to Accelerated Brain Aging
Chronic musculoskeletal pain, particularly knee osteoarthritis, is linked to accelerated brain aging and subsequent memory decline, new research showed.
Using structural MRI data from more than 9000 adults with knee osteoarthritis (KOA) from the UK Biobank, investigators developed a brain age model to compare an individual’s brain age with their chronological age. Those with KOA showed a much faster rate of brain aging than healthy individuals.
The acceleration in brain aging was largely driven by the hippocampus and predicted memory decline and incident dementia during follow-up. Researchers identified a gene highly expressed in glial cells as a possible genetic factor for accelerated brain aging.
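The core arithmetic of a brain age model is simple even though the underlying prediction is not: the model estimates an "apparent" brain age from MRI features, and the gap between that estimate and chronological age is the measure of accelerated aging. The sketch below illustrates only that final step; the model, participants, and numbers are hypothetical, not the study's.

```python
# Illustrative sketch (not the authors' model): the "predicted age difference"
# is predicted brain age minus chronological age. A positive gap suggests a
# brain that appears older than its years.

def brain_age_gap(predicted_brain_age: float, chronological_age: float) -> float:
    """Predicted age difference (PAD) in years."""
    return predicted_brain_age - chronological_age

# Hypothetical participants: (model-predicted brain age, chronological age)
cohort = [(63.2, 58.0), (55.1, 56.0), (71.8, 65.0)]
gaps = [brain_age_gap(p, c) for p, c in cohort]
mean_gap = sum(gaps) / len(gaps)  # group-level acceleration estimate
```

In the study, this gap was the quantity compared between the KOA cohort and healthy controls.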
“We demonstrate the accelerated brain aging and cognitive decline in chronic musculoskeletal pain, in particular knee osteoarthritis, and provide a neural marker for early detection and intervention,” said co-first author Jiao Liu, PhD candidate, Chinese Academy of Sciences, Beijing.
“We are interested to know how to slow down the aging brain in chronic musculoskeletal pain patients. Proper exercise and lifestyle may reduce the risk,” Dr. Liu said.
The study was published online in Nature Mental Health.
Common Condition
Chronic musculoskeletal pain (CMP) affects more than 40% of the world’s population and has been shown to have a harmful impact on cognitive function, although the exact mechanisms remain unclear. Prior research has found that inflammatory markers associated with brain aging are higher in patients with CMP, pointing to a link between brain aging and CMP.
To investigate further, researchers explored patterns of brain aging in healthy cohorts and cohorts with four common types of CMP — chronic knee pain, chronic back pain, chronic neck pain, and chronic hip pain.
Using their brain age model, investigators observed significantly increased brain aging, or “predicted age difference,” only in individuals with KOA (P < .001). The observation was validated in an independent dataset (P = .020), suggesting a pattern of brain aging acceleration specific to KOA.
This acceleration was primarily driven by key brain regions involved in cognitive processing, including the hippocampus and orbitofrontal cortex, and was correlated with longitudinal memory decline and dementia risk.
These data also suggest that the SLC39A8 gene, which is highly expressed in glial cells, might be a key genetic factor underpinning this acceleration.
“We not only revealed the specificity of accelerated brain aging in knee osteoarthritis patients, but importantly, we also provided longitudinal evidence suggesting the ability of our brain aging marker to predict future memory decline and increased dementia risk,” corresponding author Yiheng Tu, PhD, also with Chinese Academy of Sciences, Beijing, said in a news release.
A Future Treatment Target?
Commenting on this research, Shaheen Lakhan, MD, PhD, a neurologist and researcher based in Miami, noted that people with KOA in this study showed signs of “faster brain aging on scans. Think of it as your brain wearing a disguise, appearing older than its actual years.”
“Inflammation, a key player in osteoarthritis, might be playing a double agent, wreaking havoc not just on your joints but potentially on your memory too. Researchers even identified a specific gene linked to both knee pain and faster brain aging, hinting at a potential target for future treatments,” he added.
“Importantly, the increased risk of cognitive decline and dementia associated with chronic pain is likely one of many factors, and probably not a very high one on its own,” Dr. Lakhan noted.
The “good news,” he said, is that there are many “well-established ways to keep your brain sharp. Regular exercise, a healthy diet, and staying mentally stimulated are all proven strategies to reduce dementia risk. Think of chronic pain management as another tool you can add to your brain health toolbox.”
Support for the study was provided by the STI-2030 Major Project, the National Natural Science Foundation of China, the Scientific Foundation of the Institute of Psychology, Chinese Academy of Sciences, and the Young Elite Scientist Sponsorship Program by the China Association for Science and Technology. Dr. Liu and Dr. Lakhan had no relevant disclosures.
A version of this article appeared on Medscape.com.
FROM NATURE MENTAL HEALTH
Arm Fat Raises CVD Risk in People With Type 2 Diabetes
TOPLINE:
In people with type 2 diabetes (T2D), higher levels of arm and trunk fat are associated with an increased risk for cardiovascular disease (CVD) and mortality, while higher levels of leg fat are associated with a reduced risk for these conditions.
METHODOLOGY:
- People with T2D have a twofold to fourfold higher risk for CVD and mortality, and evidence shows obesity management helps delay complications and premature death, but an elevated body mass index (BMI) may be insufficient to measure obesity.
- In the “obesity paradox,” people with elevated BMI may have a lower CVD risk than people of normal weight.
- Researchers prospectively investigated how regional body fat accumulation was associated with CVD risk in 21,472 people with T2D (mean age, 58.9 years; 60.7% men; BMI about 29-33) from the UK Biobank (2006-2010), followed up for a median of 7.7 years.
- The regional body fat distribution in arms, trunk, and legs was assessed using bioelectrical impedance analysis.
- The primary outcomes were the incidence of CVD, all-cause mortality, and CVD mortality.
TAKEAWAY:
- Higher arm and trunk fat percentages were associated with an increased risk for CVD and mortality.
- However, participants in the highest quartile of leg fat percentage had a lower risk for CVD than those in the lowest quartile (HR, 0.72; 95% CI, 0.58-0.90).
- A nonlinear relationship was observed between higher leg fat percentage and lower CVD risk and between higher trunk fat percentage and higher CVD risk, whereas a linear relationship was noted between higher arm fat percentage and higher CVD risk.
- The patterns of association were similar for both all-cause mortality and CVD mortality. Overall patterns were similar for men and women.
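The quartile comparison described above starts from a simple grouping step: participants are ranked by regional fat percentage and split at the 25th, 50th, and 75th percentiles. The sketch below shows that grouping with hypothetical values; the study itself then fit Cox proportional-hazards models to these groups to obtain hazard ratios (e.g., HR 0.72), which is not reproduced here.

```python
# Illustrative quartile assignment for a body-fat-percentage variable.
# Data are hypothetical, not the study's.
import statistics

leg_fat_pct = [18.5, 22.1, 25.4, 30.2, 16.9, 27.8, 21.0, 33.5]

# Cut points at the 25th, 50th, and 75th percentiles.
q1, q2, q3 = statistics.quantiles(leg_fat_pct, n=4)

def quartile(value: float) -> int:
    """Return 1 (lowest) through 4 (highest) quartile for a fat percentage."""
    if value <= q1:
        return 1
    if value <= q2:
        return 2
    if value <= q3:
        return 3
    return 4

groups = [quartile(v) for v in leg_fat_pct]
```

Outcome rates (or hazards) in group 4 are then compared against group 1, the reference quartile.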
IN PRACTICE:
“Our findings add to the understanding of body fat distribution in patients with T2D, which highlights the importance of considering both the amount and the location of body fat when assessing CVD and mortality risk among patients with T2D,” wrote the authors.
SOURCE:
The study led by Zixin Qiu, School of Public Health, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China, was published online in The Journal of Clinical Endocrinology & Metabolism.
LIMITATIONS:
As body fat was measured only once at the beginning of the study, its changing association over time could not be assessed. Moreover, the findings were primarily based on predominantly White UK adults, potentially restricting their generalizability to other population groups. Furthermore, diabetes was diagnosed using self-reported medical history, medication, and hemoglobin A1c levels, implying that some cases may have gone undetected at baseline.
DISCLOSURES:
This study was funded by grants from the National Natural Science Foundation of China, Hubei Province Science Fund for Distinguished Young Scholars, and Fundamental Research Funds for the Central Universities. The authors declared no conflicts of interest.
A version of this article appeared on Medscape.com.
Can Short Cycles of a Fasting-Like Diet Reduce Disease Risk?
TOPLINE:
Monthly 5-day cycles of a fasting-mimicking diet (FMD) were associated with reduced insulin resistance, liver fat, and immune aging, and with a lower estimated biological age, independent of weight loss.
METHODOLOGY:
- In two clinical trials, monthly 5-day cycles of an FMD (a proprietary line of plant-based, low-calorie, and low-protein food products) showed lower body weight, body fat, and blood pressure at 3 months.
- Researchers assessed secondary outcomes for the impact of the diet on risk factors for metabolic syndrome and biomarkers associated with aging and age-related diseases.
- The present analysis used data from nearly half of the original 184 participants (aged 18-70 years) in the two trials who completed three to four monthly cycles of 5 days of an FMD, either in a crossover design compared with a normal diet or in an intervention group compared with people following a Mediterranean diet.
- Abdominal fat and hepatic fat were measured using an MRI in a subset of representative participants. The study also assessed metabolic blood markers and lipids and lymphoid-to-myeloid ratios (for immune aging).
- Biological age estimation was calculated from seven clinical chemistry measures, and life expectancy and mortality risk estimates and a simulation of continued FMD cycles were based on the National Health and Nutrition Examination Survey.
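Clinical-chemistry biological age estimates of the kind mentioned above generally take the shape of a weighted combination of standardized biomarker deviations added to a baseline. The sketch below shows only that general shape; the study's actual model, its seven measures, and their weights are not reproduced, and every name and coefficient here is hypothetical.

```python
# Purely illustrative biological age estimate. All biomarker names, reference
# values, and weights below are hypothetical, not the study's.

# (name, measured value, reference mean, reference SD, weight in years per SD)
markers = [
    ("glucose_mmol_l",  5.9,  5.1, 0.6,  1.2),
    ("crp_mg_l",        2.4,  1.5, 1.8,  0.9),
    ("albumin_g_l",    43.0, 45.0, 3.0, -0.8),
]

def biological_age(chronological_age: float) -> float:
    """Baseline age shifted by standardized biomarker deviations (illustrative)."""
    shift = sum(w * (value - mean) / sd for _, value, mean, sd, w in markers)
    return chronological_age + shift
```

A participant whose markers sit at the reference means would have a biological age equal to their chronological age; adverse deviations push the estimate upward.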
TAKEAWAY:
- In 15 volunteers measured by MRI, body mass index (P = .0002), total body fat (P = .002), subcutaneous adipose tissue (P = .008), visceral adipose tissue (P = .002), and hepatic fat fraction (P = .049) decreased after the third FMD cycle, with a 50% reduction in liver fat for the five people with hepatic steatosis.
- In 11 participants with prediabetes, insulin resistance (measured by homeostatic model assessment) decreased from 1.473 to 1.209 (P = .046), while A1c levels dropped from 5.8% to 5.43% (P = .032) after the third FMD cycle.
- The lymphoid-to-myeloid ratio improved (P = .005) in all study participants receiving three FMD cycles, indicating an immune aging reversal.
- The estimated median biological age of the 86 participants who completed three FMD cycles in both trials decreased by nearly 2.5 years, independent of weight loss.
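The homeostatic model assessment (HOMA-IR) cited in the takeaways is a standard index of insulin resistance computed from fasting glucose and fasting insulin: glucose (mmol/L) times insulin (µU/mL) divided by 22.5 (equivalently, glucose in mg/dL divided by 405). The sketch below uses hypothetical input values.

```python
# HOMA-IR: a conventional fasting-sample index of insulin resistance.
# Input values below are hypothetical, not the study's measurements.

def homa_ir(glucose_mmol_l: float, insulin_uU_ml: float) -> float:
    """HOMA-IR = fasting glucose (mmol/L) x fasting insulin (uU/mL) / 22.5."""
    return (glucose_mmol_l * insulin_uU_ml) / 22.5

# Example: fasting glucose 5.5 mmol/L, fasting insulin 6.0 uU/mL
index = homa_ir(5.5, 6.0)
```

Values in the 1.4-1.5 range, like the study's baseline of 1.473, indicate mild insulin resistance; lower is better.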
IN PRACTICE:
“Together our findings indicate that the FMD is a feasible periodic dietary intervention that reduces disease risk factors and biological age,” the authors wrote.
SOURCE:
The study, led by Sebastian Brandhorst, PhD, Leonard Davis School of Gerontology, University of Southern California (USC), Los Angeles, and Morgan E. Levine, PhD, Department of Pathology, Yale School of Medicine, New Haven, Connecticut, was published in Nature Communications.
LIMITATIONS:
The study estimated the effects of monthly FMD cycles based on results from two clinical trials and included a small subset of trial volunteers. By study measures, the cohort was healthier and biologically younger than average people of similar chronological age. Of the 86 participants, 24 who underwent FMD cycles exhibited increased biological age. The simulation did not consider compliance, dropout, mortality, or the bias that may arise owing to enthusiastic volunteers. Estimated risk reductions assume an effect of change in biological age, which hasn’t been proven. Projections from extending the effects of FMD to a lifelong intervention may require cautious interpretation.
DISCLOSURES:
The study was supported by the USC Edna Jones chair fund and funds from NIH/NIA and the Yale PEPPER Center. The experimental diet was provided by L-Nutra Inc. Some authors declared an equity interest in L-Nutra, with one author’s equity to be assigned to the nonprofit foundation Create Cures. Others disclosed no conflicts of interest.
A version of this article appeared on Medscape.com.
Using AI to Transform Diabetic Foot and Limb Preservation
Diabetic foot complications represent a major global health challenge, with a high prevalence among patients with diabetes. A diabetic foot ulcer (DFU) not only affects the patient's quality of life but also increases the risk for amputation.
Worldwide, a DFU occurs every second, and an amputation occurs every 20 seconds. The limitations of current detection and intervention methods underline the urgent need for innovative solutions.
Recent advances in artificial intelligence (AI) have paved the way for individualized risk prediction models for chronic wound management. These models use deep learning algorithms to analyze clinical data and images, providing personalized treatment plans that may improve healing outcomes and reduce the risk for amputation.
AI-powered tools can also be deployed for the diagnosis of diabetic foot complications. Using image analysis and pattern recognition, AI tools are learning to accurately detect signs of DFUs and other complications, facilitating early and effective intervention. Our group and others have been working not only on imaging devices but also on thermographic tools that — with the help of AI — can create an automated “foot selfie” to predict and prevent problems before they start.
AI’s predictive capabilities are instrumental to its clinical value. By identifying patients at high risk for DFUs, healthcare providers can implement preemptive measures, significantly reducing the likelihood of severe complications.
Although the potential benefits of AI in diabetic foot care are immense, integrating these tools into clinical practice poses challenges. These include ensuring the reliability of AI predictions, addressing data privacy concerns, and training healthcare professionals on the use of AI technologies.
As in so many other areas in our lives, AI holds the promise to revolutionize diabetic foot and limb preservation, offering hope for improved patient outcomes through early detection, precise diagnosis, and personalized care. However, realizing this potential requires ongoing research, development, and collaboration across the medical and technological fields to ensure these innovative solutions can be effectively integrated into standard care practices.
Dr. Armstrong is professor of surgery, Keck School of Medicine of University of Southern California, Los Angeles, California. He has disclosed the following relevant financial relationships: partially supported by the National Institutes of Health, National Institute of Diabetes and Digestive and Kidney Diseases, Award Number 1R01124789-01A1.
A version of this article first appeared on Medscape.com.
Telemedicine Reduces Rehospitalization, Revascularization in Post-PCI ACS Patients
ATLANTA — Patients with acute coronary syndrome (ACS) who had a myocardial infarction or unstable angina and underwent percutaneous coronary intervention (PCI) had a 76% lower rate of hospital readmission after 6 months if they participated in a remote monitoring protocol compared with similar patients who had standard post-discharge care, results of a new trial suggest.
The TELE-ACS trial showed that at 6 months, telemedicine patients also had statistically significantly lower rates of post-discharge emergency department visits, unplanned coronary revascularizations, and cardiovascular symptoms, such as chest pain, shortness of breath and dizziness. However, the rates of major adverse cardiovascular events (MACE) were similar between the two groups. The protocol included consultation with a cardiologist who reviewed home-monitoring data.
“The team was able to aid in preventing unnecessary presentations and advised the patients to seek emergency care whenever was necessary,” Nasser Alshahrani, MSc, a clinical research fellow at Imperial College London, said while presenting the results at the American College of Cardiology meeting. “The TELE-ACS protocol provided a significant reduction in readmission rates post-ACS and other adverse events.”
The study findings were published online simultaneously in the Journal of the American College of Cardiology.
Telemedicine Protocol
The trial, conducted from January 2022 to April 2023, randomly assigned 337 patients to telemedicine or standard care when they were discharged after PCI and had at least one cardiovascular risk factor. The telemedicine protocol consisted of a 12-lead electrocardiogram belt, an automated blood-pressure monitor, and a pulse oximeter.
Patients in the telemedicine arm initiated the remote monitoring protocol if they thought they had cardiac symptoms. The majority (86%) were men with what the study described as “a high preponderance of cardiovascular risk factors.” Average age was 58.1 years.
If a telemedicine patient initiated the protocol, a cardiologist remotely assessed the patient’s symptoms and channeled the patient to the appropriate care pathway, whether reassuring the patient, sending them to a primary care physician or the emergency department, or having them call emergency services. Patients who didn’t get a call back from the cardiologist within 15 minutes were told to seek care in the standard clinical pathway.
Telemedicine patients were given the telemonitoring package and training in how to use the devices before they were discharged. They also received three follow-up quality control calls in the first two months to ensure they were using the equipment correctly. They kept the telemonitoring equipment for 8 months, but were followed out to 9 months. Six telemedicine patients dropped out while one standard care patient withdrew from the study.
Results showed that at 6 months, telemedicine patients had statistically significantly lower rates of post-discharge emergency department visits (25% vs 37%, P < .001), unplanned coronary revascularizations (3% vs 9%, P < .01) and cardiovascular symptoms, such as chest pain, shortness of breath and dizziness (a 13% to 18% difference for each symptom, P < .01).
MACE rates were similar between the two groups.
At 9 months, 3 months after the protocol ended, 20 telemedicine patients and 50 standard-care patients were readmitted to the hospital, while 52 and 73, respectively, went to the emergency department.
The telemedicine patients also had shorter hospital stays: an average of 0.5 and 1.2 days at 6 and 9 months, respectively, vs 1.5 and 1.8 days in the standard treatment arm (P < .001 for both).
Mr. Alshahrani noted several limitations with the study, namely that 86% of participants were men, and that the intervention was only offered to people who had smartphones. “The high level of support for the telemedicine group, with prompt cardiology responses, may be challenging to replicate outside a trial setting, requiring significant investment and training,” he added.
Human Element Key
In an interview from London after the presentation, lead author Ramzi Khamis, MB ChB, PhD, said, “This was quite a basic study. Really what we did was we integrated a clinical decision-making algorithm that we perfected with some quite novel but basic technology.” Future research should strive to add a home troponin test to the protocol and an artificial intelligence component, he said.
However, Dr. Khamis noted that human interaction was key to the success of the TELE-ACS trial. “The human factor is very important here and I think it would be really interesting to have a head-to-head comparison of human interaction with remote monitoring vs an AI-driven interaction,” he said. “I have my doubts that AI would be able to beat the human factor here.”
Lawrence Phillips, MD, medical director of outpatient cardiology at NYU Langone Heart, told this news organization that the study was appropriately powered to evaluate the telemedicine protocol, and that it could serve as a template for other studies of remote monitoring in cardiology.
“I think that this study is forming the foundation of evolving telemedicine data,” he said. “It shows really interesting results, and I’m sure it’s going to be reproduced in different ways going forward.”
While other studies have shown the utility of telemedicine to decrease unnecessary hospitalizations, this study went one step further, Dr. Phillips said. “What was unique about this study was the package that they put together,” he added. “It was a combination of telehealth and being able to speak with someone when you have concerns with objective data of an electrocardiogram, blood-pressure cuff, and oxygen level assessment, which is an interesting approach having that objective data with [a] subjective element.”
The trial received funding from the British Heart Foundation; King Khalid University, Abha, Saudi Arabia via The Saudi Arabian Cultural Bureau; Sansour Fund, Imperial Healthcare Charity; and Safwan Sobhan Fund at Imperial College London. Mr. Alshahrani and Dr. Khamis have no relevant relationships to disclose. Dr. Phillips has no relevant disclosures.
A version of this article first appeared on Medscape.com.
FROM THE JOURNAL OF THE AMERICAN COLLEGE OF CARDIOLOGY
New Tool Helps Clinicians Detect Zoom Dysmorphia in Virtual Settings
SAN DIEGO — A new screening tool may help clinicians identify zoom dysmorphia in virtual settings, according to George Kroumpouzos, MD, PhD, who, with colleagues, recently proposed the tool to help identify patients with the condition.
The term, coined in 2020 by dermatologist Shadi Kourosh, MD, MPH, and colleagues at Harvard Medical School, Boston, refers to an altered or skewed negative perception of one’s body image that results from spending extended amounts of time on video calls. Speaking at the annual meeting of the American Academy of Dermatology, Dr. Kroumpouzos, clinical associate professor of dermatology at Brown University, Providence, Rhode Island, explained that most people believe that zoom dysmorphia falls within the spectrum of body dysmorphic disorder (BDD). He described zoom dysmorphia as “a facial dysmorphia triggered or aggravated by frequent virtual meetings. Frequent use of videoconferencing platforms is linked to a distorted perception of facial images, which leads to dysmorphic concerns.”
Individuals with zoom dysmorphia tend to scrutinize their facial features and fixate on what they think needs to improve, he continued. They experience anxiety about attending video conferences with the camera on and feel pressured to appear perfect before virtual meetings. “They find facial flaws during virtual meetings, and they believe others notice their perceived flaws,” he said. “This all has drastic effects on body dissatisfaction and self-esteem, which leads to a desire to seek cosmetic procedures. It interferes with an individual’s life and can trigger or aggravate body dysmorphic disorder.”
While several tools have been validated in cosmetic settings to screen for BDD, such as the 9-item Body Dysmorphic Disorder Questionnaire–Dermatology questionnaire, the 7-item Body Dysmorphic Disorder Questionnaire–Aesthetic Surgery questionnaire, the Cosmetic Procedure Screening Questionnaire, and the Body Dysmorphic Disorder Symptom Scale, no formal screening tools exist to identify zoom dysmorphia. To complicate matters, “identifying dysmorphic concerns in virtual settings can be challenging,” Dr. Kroumpouzos added. “This makes the recognition of zoom dysmorphia during telehealth visits even more difficult.”
Individuals who may have zoom dysmorphia may fear being misunderstood, judged, or ridiculed because of a perceived flaw in appearance, he said, making establishing rapport and eye contact difficult. “There’s a reticence and silence due to the individual’s avoidant characteristics,” he said. “Patients may become easily distracted or disengaged during telehealth visits in case of technical issues. Psychiatric comorbidities can mask symptoms related to dysmorphic concerns.”
To bridge this gap, Dr. Kroumpouzos and colleagues have proposed a screening tool, a questionnaire related to features of zoom dysmorphia, to facilitate recognition of zoom dysmorphia in virtual settings.
The first component consists of open-ended questions such as “Are you comfortable with being interviewed in a virtual appointment?” and “How do you feel about your appearance during virtual meetings?” Such questions “aim to start the dialogue, to facilitate the discussion with a patient who may be shy or avoidant,” Dr. Kroumpouzos explained.
The second component of the tool consists of questions more specific to screening for zoom dysmorphia, starting with “Are you concerned about facial flaws?” If the patient answers no, they don’t qualify for any others, he said. “But, if they answer yes to that question and yes to at least one more [question], they may have zoom dysmorphia.”
Other questions include, “Do you think that your face is not friendly to the camera?” “Do you hesitate to open the camera?” “Have you tried to hide or camouflage your flaw with your hands, hair, makeup, or clothing?” “Have you sought advice from others to improve your appearance or image?” “Do you often use the filter features of the video conferencing platform?” “Did you consider buying a new camera or equipment that helps improve your image?”
If the clinician deems the patient a candidate for the diagnosis of zoom dysmorphia, the tool recommends asking a BDD-focused question: “In the past month, have you been very concerned that there is something wrong with your physical appearance or the way one or more parts of your body look?” If the patient answers yes, “that individual should be invited to fill out a questionnaire specifically for BDD or come to the office for further evaluation,” Dr. Kroumpouzos said.
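The scoring logic described above — a gateway item, at least one affirmative follow-up, then a BDD-focused question — can be sketched as a small decision function. This is only an illustrative encoding of the proposed rule, not a validated instrument; the function and triage labels are hypothetical.

```python
# Illustrative sketch of the proposed zoom dysmorphia screening rule.
# Not a validated clinical instrument; labels are hypothetical.

GATEWAY = "Are you concerned about facial flaws?"

FOLLOW_UPS = [
    "Do you think that your face is not friendly to the camera?",
    "Do you hesitate to open the camera?",
    "Have you tried to hide or camouflage your flaw with your hands, hair, makeup, or clothing?",
    "Have you sought advice from others to improve your appearance or image?",
    "Do you often use the filter features of the video conferencing platform?",
    "Did you consider buying a new camera or equipment that helps improve your image?",
]

def screen_zoom_dysmorphia(answers: dict) -> str:
    """Apply the proposed decision rule to yes/no answers.

    Returns a coarse triage label, not a diagnosis.
    """
    # A 'no' on the gateway question ends the screen.
    if not answers.get(GATEWAY, False):
        return "does not qualify"
    # Gateway 'yes' plus at least one affirmative follow-up
    # suggests possible zoom dysmorphia.
    if any(answers.get(q, False) for q in FOLLOW_UPS):
        return "possible zoom dysmorphia; ask BDD follow-up question"
    return "gateway concern only; monitor"
```

A positive result would prompt the BDD-focused question and, if that is also affirmative, a dedicated BDD questionnaire or in-office evaluation, as described above.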
In his view, the brevity of the proposed screening tool makes it easy to incorporate into clinical practice, and the “yes or no” questions are practical. “It is crucial to elicit the presence of zoom dysmorphia in its early stage,” he said. “Zoom dysmorphia may trigger an increase in BDD, [so] it is essential to identify the presence of BDD in zoom dysmorphia sufferers and treat it appropriately.”
Dr. Kroumpouzos reported having no relevant financial disclosures.
FROM AAD 2024
Gut Bacteria’s Influence on Obesity Differs in Men and Women
Gut bacteria predictive of body mass index (BMI), waist circumference, and fat mass differ between men and women, and interventions to prevent obesity may therefore need to differ as well, new research suggested.
Metagenomic analyses of fecal samples and metabolomic analyses of serum samples from 361 volunteers in Spain showed that an imbalance in specific bacterial strains likely plays an important role in the onset and development of obesity and that there are “considerable differences” between the sexes, said lead study author Paula Aranaz, MD, Centre for Nutrition Research at the University of Navarra, Pamplona, Spain.
“We are still far from knowing the magnitude of the effect that the microbiota [bacteria, viruses, fungi, and protozoa] has on our metabolic health and, therefore, on the greater or lesser risk of suffering from obesity,” Dr. Aranaz told this news organization.
“However,” she said, “what does seem clear is that the microorganisms of our intestine perform a crucial role in the way we metabolize nutrients and, therefore, influence the compounds and molecules that circulate through our body, affecting different organs and tissues, and our general metabolic health.”
The study will be presented at the European Congress on Obesity (ECO) 2024, to be held in Venice, Italy, from May 12 to 15. The abstract is now available online.
Variation in Bacteria Species, Abundance
The researchers examined the fecal metabolome of 361 adult volunteers (median age, 44 years; 70% women) from the Spanish Obekit randomized trial, which investigated the relationship between genetic variants and participants’ responses to a low-calorie diet.
A total of 65 participants had normal weight, 110 had overweight, and 186 had obesity. They were matched for sex and age and classified as LOW or HIGH according to an obesity (OB) index.
LOW included those with a BMI ≤ 30 kg/m², fat mass percentage ≤ 25% (women) or ≤ 32% (men), and waist circumference ≤ 88 cm (women) or ≤ 102 cm (men). HIGH included those with a BMI > 30 kg/m², fat mass > 25% (women) or > 32% (men), and waist circumference > 88 cm (women) or > 102 cm (men).
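The sex-specific cutoffs above can be made explicit in a small function (an illustrative sketch only; the function name is invented, and treating mixed profiles that meet only some LOW criteria as HIGH is an assumption, not stated in the study):

```python
def ob_index(sex: str, bmi: float, fat_pct: float, waist_cm: float) -> str:
    """Classify a participant as LOW or HIGH per the cutoffs above.
    Fat mass cutoff: 25% (women) / 32% (men);
    waist cutoff: 88 cm (women) / 102 cm (men)."""
    fat_cut = 25.0 if sex == "female" else 32.0
    waist_cut = 88.0 if sex == "female" else 102.0
    if bmi <= 30.0 and fat_pct <= fat_cut and waist_cm <= waist_cut:
        return "LOW"
    # Assumption: any profile exceeding at least one cutoff is HIGH.
    return "HIGH"
```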
In men, a greater abundance of Parabacteroides helcogenes and Campylobacter canadensis species was associated with higher BMI, fat mass, and waist circumference.
By contrast, in women, a greater abundance of Prevotella micans, P brevis, and P sacharolitica was predictive of higher BMI, fat mass, and waist circumference.
Untargeted metabolomic analyses revealed variation in the abundance of certain metabolites in participants with a HIGH OB index, notably higher levels of phospholipids, which are implicated in the development of metabolic disease and modulate insulin sensitivity, and of sphingolipids, which play a role in the development of diabetes and the emergence of vascular complications.
“We can reduce the risk of metabolic diseases by modulating the gut microbiome through nutritional and lifestyle factors, including dietary patterns, foods, exercise, probiotics, and postbiotics,” Dr. Aranaz said. Which modifications can and should be made “depend on many factors, including the host genetics, endocrine system, sex, and age.”
The researchers currently are working to try to relate the identified metabolites to the bacterial species that could be producing them and to characterize the biological effect that these species and their metabolites exert on the organism, Dr. Aranaz added.
Ultimately, she said, “we would like to [design] a microbiota/metabolomic test that can be used in clinical practice to identify human enterotypes and to personalize the dietary strategies to minimize the health risks related to gut dysbiosis.”
No funding was reported. Dr. Aranaz declared no conflicts of interest.
A version of this article appeared on Medscape.com.
An App for ED?
Little blue pill meets a little blue light.
A digital application can improve erectile function, according to new research presented at the European Association of Urology (EAU) Annual Congress on April 8, 2024.
Researchers developed a 12-week, self-managed program to treat erectile dysfunction (ED). The program is delivered to patients’ mobile devices and encourages users to do cardiovascular training, pelvic floor exercises, and physiotherapy. It also provides information about ED, sexual therapy, and stress management.
“The treatment of ED through physical activity and/or lifestyle changes is recommended in current European guidelines but is not well established in clinical practice,” according to the researchers.
App or Waitlist
The app, known as Kranus Edera, was created by Kranus Health. It is available by prescription in Germany and France.
To study the effectiveness of the app, investigators conducted a randomized controlled trial at the University Hospital Münster in Germany.
The study included 241 men who had scores of 21 or less on the International Index of Erectile Function (IIEF-5).
About half of the participants were randomly assigned to get the app. The rest were placed on a waiting list for the technology and served as a control group.
Men who received the app also reported gains in measures of quality of life (20.5 vs −0.04) and patient activation (11.1 vs 0.64).
Nearly nine in 10 people who used the app did so several times per week, the researchers reported.
Sabine Kliesch, MD, with University Hospital Münster, led the study, which was presented at a poster session on April 8 at the EAU Congress in Paris.
Fully Reimbursed in Germany
In Germany, Kranus Edera has been included on a government list of digital health apps that are fully reimbursed by insurers, partly based on the results of the clinical trial. The cost there is €235 (about $255).
Patients typically notice improvements in 2-4 weeks, according to the company’s website. Patients who are taking a phosphodiesterase-5 enzyme inhibitor for ED may continue taking the medication, although they may no longer need it or they may be able to reduce the dose after treatment with the app, it says.
Kranus also has virtual treatments for incontinence in women and voiding dysfunction.
The app is meant to save doctors time by providing patients with detailed explanations and guidance within the app itself, said Laura Wiemer, MD, senior medical director of Kranus.
The app’s modules help reinforce guideline-recommended approaches to the treatment of ED “in playful ways with awards, motivational messages, and individual adjustments to help achieve better adherence and compliance of the patient,” Dr. Wiemer told this news organization.
Kranus plans to expand to the United States in 2024, she said.
A version of this article appeared on Medscape.com.
Do Adults With Obesity Feel Pain More Intensely?
TOPLINE:
Adults with excess weight or obesity tend to experience higher levels of pain intensity than those with a normal weight, highlighting the importance of addressing obesity as part of pain management strategies.
METHODOLOGY:
- Recent studies suggest that obesity may change pain perception and worsen existing painful conditions.
- To examine the association between overweight or obesity and self-perceived pain intensities, researchers conducted a meta-analysis of 22 studies that included 31,210 adults older than 18 years from diverse international cohorts.
- The participants were categorized by body mass index (BMI) as being normal weight (18.5-24.9), overweight (25.0-29.9), and obese (≥ 30). A BMI ≥ 25 was considered excess weight.
- Pain intensity was assessed by self-report using the Visual Analog Scale, Numerical Rating Scale, and Numerical Pain Rating Scale, with the lowest value indicating “no pain” and the highest value representing “pain as bad as it could be.”
- Researchers compared pain intensity between these patient BMI groups: Normal weight vs overweight plus obesity, normal weight vs overweight, normal weight vs obesity, and overweight vs obesity.
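The BMI categories used above can be sketched as a simple mapping (illustrative code, not from the study; the underweight branch is included only for completeness, since underweight individuals were not part of the analysis):

```python
def bmi_category(bmi: float) -> str:
    """Map BMI (kg/m²) to the categories used in the meta-analysis."""
    if bmi < 18.5:
        return "underweight"   # not included in the analysis
    if bmi < 25.0:
        return "normal weight"
    if bmi < 30.0:
        return "overweight"
    return "obese"             # BMI >= 25 counts as excess weight
```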
TAKEAWAY:
- Compared with people with normal weight, people with excess weight (overweight or obesity; standardized mean difference [SMD], −0.15; P = .0052) or with obesity (SMD, −0.22; P = .0008) reported higher pain intensities, with a small effect size.
- The comparison of self-reported pain between people with normal weight and those with overweight showed no statistically significant difference.
IN PRACTICE:
“These findings encourage the treatment of obesity and the control of body mass index (weight loss) as key complementary interventions for pain management,” wrote the authors.
SOURCE:
This study was led by Miguel M. Garcia, Department of Basic Health Sciences, Universidad Rey Juan Carlos, Unidad Asociada de I+D+i al Instituto de Química Médica CSIC-URJC, Alcorcón, Spain. It was published online in Frontiers in Endocrinology.
LIMITATIONS:
The analysis did not include individuals who were underweight, potentially overlooking the associations between physical pain and malnutrition. BMI may misclassify individuals with high muscularity, as it doesn’t accurately reflect adiposity and cannot distinguish between two people with similar BMIs and different body compositions. Furthermore, the study did not consider gender-based differences while evaluating pain outcomes.
DISCLOSURES:
The study received no specific funding from any funding agency in the public, commercial, or not-for-profit sectors. The authors declared no conflicts of interest.
A version of this article appeared on Medscape.com.
Analysis Finds Low Malignancy Rate in Pediatric Longitudinal Melanonychia
TOPLINE:
METHODOLOGY:
- LM — a pigmented band in the nail plate caused by increased melanin deposition — occurs in children and adults, resulting from melanocytic activation or proliferation in response to infection, systemic disease, medication, trauma, and other factors.
- Clinical features of LM in children mimic red-flag signs of subungual melanoma in adults, although LM in children is rarely subungual melanoma.
- A biopsy can confirm the diagnosis, but other considerations include scarring, the cost and stress of a procedure, and possible pain or deformity.
- The researchers conducted a systematic review and meta-analysis of the prevalence of clinical and dermoscopic features in 1391 pediatric patients with LM (diagnosed at a mean age of 5-13 years) from 24 studies published between 1996 and 2023.
TAKEAWAY:
- Of 731 lesions in which a diagnosis was provided, benign nail matrix nevus accounted for 86% of cases.
- Only eight cases of subungual melanoma in situ were diagnosed, with no cases of invasive melanoma identified.
- Most lesions occurred on the fingernails (76%), particularly in the first digits (45%), and the most frequent clinical features included dark-colored bands (70%), multicolored bands (48%), broad bandwidth (41%), and pseudo-Hutchinson sign (41%).
- During a median follow-up of 1-5.5 years, 30% of lesions continued to evolve with changes in width or color, while 23% remained stable and 20% underwent spontaneous regression.
IN PRACTICE:
“In the pivotal clinical decision of whether to biopsy a child with longitudinal melanonychia, perhaps with features that would require a prompt biopsy in an adult, this study provides data to support the option of clinical monitoring,” the authors wrote.
SOURCE:
The meta-analysis, led by Serena Yun-Chen Tsai, MD, in the Department of Dermatology, Massachusetts General Hospital, Boston, Massachusetts, was published online in Pediatric Dermatology.
LIMITATIONS:
Most studies were conducted in Asia, and data stratified by skin type were limited. Inconsistent reporting and missing critical features could affect data quality. Also, certain features displayed high heterogeneity.
DISCLOSURES:
This meta-analysis was supported by the Pediatric Dermatology Research Alliance Career Bridge Research Grant. One co-author disclosed relationships with UpToDate (author, reviewer), Skin Analytics (consultant), and DermTech (research materials).
A version of this article appeared on Medscape.com.