Rheumatologic Disease–Associated Hyperinflammatory Condition Successfully Treated with Emapalumab
TOPLINE:
Emapalumab (Gamifant)-containing regimens stabilize key laboratory parameters and show a high 12-month survival probability in patients with rheumatologic disease–associated hemophagocytic lymphohistiocytosis (HLH).
METHODOLOGY:
- Researchers conducted a retrospective medical chart review study across 33 US hospitals to assess the real-world treatment patterns and outcomes in patients with HLH treated with emapalumab.
- They included 15 patients with rheumatologic disease–associated HLH (median age at diagnosis, 5 years; 73.3% female) who received at least one dose of emapalumab between November 20, 2018, and October 31, 2021.
- Most patients with rheumatologic disease–associated HLH had either systemic juvenile idiopathic arthritis (n = 9) or adult-onset Still’s disease (n = 1).
- Patients received emapalumab for refractory, recurrent, or progressive disease, with an overall treatment duration of 63 days.
- The primary objective of this study was to describe emapalumab treatment patterns such as time to initiation, treatment duration, dosing patterns, and reasons for initiation.
TAKEAWAY:
- Most patients (60%) with rheumatologic disease–associated HLH were critically ill and were initiated on emapalumab in an intensive care unit; emapalumab was mostly initiated for treating refractory (33.3%) and recurrent (33.3%) disease.
- All patients concurrently received emapalumab with other HLH-related therapies, with glucocorticoids (100%) and anakinra (60%) used most frequently.
- Emapalumab treatment led to normalization of fibrinogen levels (> 360 mg/dL, per defined laboratory criteria) in all patients with rheumatologic disease–associated HLH and to an 80.6% reduction in the required glucocorticoid dose.
- The 12-month survival probability from the initiation of emapalumab was 86.7% in all patients with rheumatologic disease–associated HLH and 90.0% in the subset with systemic juvenile idiopathic arthritis or adult-onset Still’s disease.
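For readers curious how a 12-month survival probability such as the 86.7% figure above is typically derived, here is a minimal Kaplan-Meier sketch in Python using the lifelines package. The follow-up times and outcomes below are invented for illustration and are not the study's patient-level data.

```python
# Minimal sketch, not the study's analysis: estimating a 12-month survival
# probability with a Kaplan-Meier curve. Follow-up times and outcomes below
# are invented (15 hypothetical patients, 2 deaths before 12 months).
import numpy as np
from lifelines import KaplanMeierFitter

months_followed = np.array([12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 12, 2, 5])
died = np.array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1])  # 1 = died, 0 = censored

kmf = KaplanMeierFitter()
kmf.fit(durations=months_followed, event_observed=died)

# With 2 deaths among 15 patients and no earlier censoring, S(12) = 13/15 ≈ 0.867.
print(kmf.survival_function_at_times(12))
```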
IN PRACTICE:
“In this study, emapalumab-containing regimens normalized rheumatologic disease–associated laboratory parameters, substantially reduced glucocorticoid dose, and were associated with low mortality,” the authors wrote.
SOURCE:
The study was led by Shanmuganathan Chandrakasan, MD, Children’s Healthcare of Atlanta, Emory University, Atlanta, Georgia, and was published online on September 8, 2024, in Arthritis & Rheumatology.
LIMITATIONS:
Chart data required for analyses were missing or incomplete in this retrospective study. The sample size of patients with rheumatologic disease–associated HLH was small. No safety data were collected.
DISCLOSURES:
The study was supported by Sobi, which markets emapalumab. Some authors declared receiving grants, consulting fees, or payments or having financial and nonfinancial interests and other ties with several pharmaceutical companies, including Sobi.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Walking App Works Only if Users Think It Does
TOPLINE:
Apps designed to increase physical activity may help increase daily step counts for users who believe the intervention is beneficial, but not for those who do not. An app’s effectiveness is notably influenced by how useful users perceive it to be.
METHODOLOGY:
- Researchers conducted a randomized controlled trial from February 2021 to May 2022 to evaluate the effectiveness of SNapp, an adaptive app designed to promote walking through tailored coaching content.
- Overall, 176 adults (76% women; mean age, 56 years) were randomly assigned to use the app plus tailored coaching content (SNapp group; n = 87) or only the step counter app (control group; n = 89).
- SNapp’s coaching content provided personalized feedback on step counts and recommendations for increasing walking, while also considering individual preferences for behavior change techniques.
- The primary outcome was the daily step count recorded by the app, which was updated on an hourly basis in a database over an intervention period of 12 months.
- Perceptions of ease of use and usefulness were assessed to determine their effect on the effectiveness of the app.
TAKEAWAY:
- Participants in the intervention group used the app on nearly 30% of days, and app use was almost identical among control participants who had the step counter app alone.
- The SNapp intervention did not significantly affect the step counts on average over time (B, −202.30; 95% CI, −889.7 to 485.1).
- Perceived usefulness significantly moderated the intervention effect of SNapp (B, 344.38; 90% CI, 40.4-648.3), but perceived ease of use did not (B, 38.60; 90% CI, −276.5 to 353.7).
- Among participants with a high perceived usefulness, the SNapp group had a higher median step count than the control group (median difference, 1260 steps; 90% CI, −3243.7 to 1298.2); however, this difference was not statistically significant.
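As a rough illustration of how a moderation effect like the one above can be tested, the sketch below fits a regression with a group-by-usefulness interaction term on simulated data. The trial itself modeled repeated daily step counts over 12 months, so this cross-sectional example only shows the general mechanics; all variable names and values are hypothetical.

```python
# Minimal sketch on simulated data: does perceived usefulness moderate the
# intervention effect? The coefficient on the group:usefulness interaction
# plays the role of the B = 344.38 moderation estimate reported above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 176
df = pd.DataFrame({
    "group": rng.integers(0, 2, n),       # 1 = SNapp + coaching, 0 = step counter only
    "usefulness": rng.normal(0, 1, n),    # centered perceived-usefulness score (hypothetical)
})
# Simulate step counts so the intervention helps only when usefulness is high.
df["steps"] = 6000 + 350 * df["group"] * df["usefulness"] + rng.normal(0, 1500, n)

model = smf.ols("steps ~ group * usefulness", data=df).fit()
print(model.params[["group", "group:usefulness"]])
```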
IN PRACTICE:
“This study shows that perceived usefulness is also an important factor influencing behavioral effects. Hence, it is essential for apps to be perceived as useful to effectively improve users’ activity levels,” the authors wrote.
SOURCE:
The study was led by Anne L. Vos, PhD, of the Amsterdam School of Communication Research at the University of Amsterdam, in the Netherlands. It was published online on September 16, 2024, in the American Journal of Preventive Medicine.
LIMITATIONS:
The study’s recruitment strategy primarily attracted highly educated individuals, limiting generalizability. The app’s accuracy in measuring steps could be improved, as it sometimes underestimated step counts. Researchers also were unable to check if participants read messages from coaches.
DISCLOSURES:
The study was supported by grants from the Dutch Heart Foundation and the Netherlands Organisation for Health Research and Development. No relevant conflicts of interest were disclosed by the authors.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Bariatric Surgery and Weight Loss Make Brain Say Meh to Sweets
TOPLINE:
Bariatric surgery and the weight loss that follows appear to reduce both the brain’s reward response to sweet taste and the preferred sweetness concentration in women with obesity.
METHODOLOGY:
- Previous studies have suggested that individuals undergoing bariatric surgery show reduced preference for sweet-tasting food post-surgery, but the mechanisms behind these changes remain unclear.
- This observational cohort study aimed to examine the neural processing of sweet taste in the reward regions of the brain before and after bariatric surgery in 24 women with obesity (mean body mass index [BMI], 47) who underwent bariatric surgery and 21 control participants with normal to overweight (mean BMI, 23.5).
- Participants (mean age, about 43 years; 75%-81% White) underwent sucrose taste testing and functional MRI (fMRI) to compare brain responses with sucrose solutions of 0.10 M and 0.40 M (akin to sugar-sweetened beverages such as Coca-Cola at ~0.32 M and Mountain Dew at ~0.35 M) versus water.
- In the bariatric surgery group, participants underwent fMRI 1-117 days before surgery, and 21 participants who lost about 20% of their weight after the surgery underwent a follow-up fMRI roughly 3-4 months later.
- The researchers analyzed the brain’s reward response using a composite activation of several reward system regions (the ventral tegmental area, ventral striatum, and orbitofrontal cortex) and of sensory regions (the primary somatosensory cortex and primary insula taste cortex).
TAKEAWAY:
- The perceived intensity of sweetness was comparable between the control group and the bariatric surgery group both before and after surgery.
- In the bariatric surgery group, the average preferred sweet concentration decreased from 0.52 M before surgery to 0.29 M after surgery (P = .008).
- The fMRI analysis indicated that, before surgery, women in the bariatric surgery group showed a trend toward a higher reward response to 0.4 M sucrose than the control participants.
- The activation of the reward region in response to 0.4 M sucrose (but not 0.1 M) declined in the bariatric surgery group after surgery (P = .042).
IN PRACTICE:
“Our findings suggest that both the brain reward response to and subjective liking of an innately desirable taste decline following bariatric surgery,” the authors wrote.
SOURCE:
This study was led by Jonathan Alessi, Indiana University School of Medicine, Indianapolis, and published online in Obesity.
LIMITATIONS:
The study sample size was relatively small, and the duration of follow-up was short, with recruitment curtailed by the COVID-19 pandemic. This study did not assess the consumption of sugar or sweetened food, which could provide further insights into changes in the dietary behavior post-surgery. Participants included women only, and the findings could have been different if men were recruited.
DISCLOSURES:
This study was funded by the American Diabetes Association, Indiana Clinical and Translational Sciences Institute, and National Institute on Alcohol Abuse and Alcoholism. Three authors reported financial relationships with some pharmaceutical companies outside of this study.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.
ANCA-Associated Vasculitis Has Five Unique Patient Clusters
TOPLINE:
A data-driven subclassification of antineutrophil cytoplasmic antibody (ANCA)–associated vasculitis has identified five distinct clusters with varying degrees of kidney involvement and systemic inflammation, offering insights into improved patient stratification and treatment approaches.
METHODOLOGY:
- ANCA-associated vasculitis is a rare and complex autoimmune disease that is traditionally classified into granulomatosis with polyangiitis (GPA) and microscopic polyangiitis (MPA).
- Researchers employed advanced artificial intelligence and big data techniques to identify phenotypically distinct subgroups of ANCA-associated vasculitis and developed a classification system using real-world patient data from the Federated Vasculitis Registry consortium.
- They included 3868 patients diagnosed with ANCA-associated vasculitis between November 1, 1966, and March 1, 2023 (mean age at diagnosis, 57.2 years; 51.9% men), across six European vasculitis registries; while a majority of patients (62.9%) were diagnosed with GPA, the remaining 37.1% were diagnosed with MPA.
- Overall, 17 clinical and demographic variables such as the age at diagnosis, gender, serum creatinine and C-reactive protein levels, the type of ANCA, and the involvement of various organ systems were used to create a model for categorizing patients into different clusters.
- The median follow-up duration was 4.2 years.
TAKEAWAY:
- Five distinct clusters were identified in ANCA-associated vasculitis; three had significant kidney involvement (the severe kidney cluster, myeloperoxidase-ANCA-positive kidney cluster, and proteinase 3-ANCA-positive kidney cluster) and two had minimal kidney involvement (young respiratory cluster and inflammatory multisystem cluster).
- The clusters with significant kidney involvement were associated with poorer outcomes, including a higher risk for kidney failure and death. The severe kidney cluster had the poorest prognosis, with mortality and the rate of end-stage kidney failure being 30.5% and 41.6%, respectively.
- The young respiratory cluster, characterized by predominant ear-nose-throat involvement and low systemic inflammation, showed the best prognostic outcomes.
- This cluster membership model showed a greater predictive accuracy for patient and kidney survival than traditional methods based on clinical diagnosis or ANCA specificity.
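The consortium’s clustering pipeline is not detailed here; as a generic illustration of how patients can be grouped from standardized clinical variables, the sketch below runs k-means on simulated data. The variable names, distributions, and the choice of k-means are assumptions for illustration only and do not reproduce the registry study’s actual method.

```python
# Minimal sketch on simulated data: grouping patients by standardizing clinical
# variables and clustering. The registry study's actual pipeline and 17 variables
# are not reproduced here; k-means and the variables below are illustrative choices.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n = 500
X = pd.DataFrame({
    "age_at_diagnosis": rng.normal(57, 15, n),
    "serum_creatinine": rng.lognormal(4.5, 0.6, n),   # µmol/L scale, hypothetical
    "crp": rng.lognormal(3.0, 1.0, n),                # mg/L scale, hypothetical
    "mpo_anca": rng.integers(0, 2, n),
    "ent_involvement": rng.integers(0, 2, n),
})

# Standardize so no single variable dominates the distance metric, then fit 5 clusters.
X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X_scaled)
print(pd.Series(labels).value_counts())
```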
IN PRACTICE:
“These findings highlight the necessity of recognizing severe kidney disease at the time of diagnosis as an indicator of poor outcome, thereby necessitating intensified treatment approaches,” experts from the Department of Internal Medicine IV (Nephrology and Hypertension), Medical University of Innsbruck, Austria, wrote in an accompanying editorial published online on August 22, 2024, in The Lancet Rheumatology.
SOURCE:
This study was led by Karl Gisslander, Department of Clinical Sciences, Lund University, Lund, Sweden, and was published online on August 22, 2024, in The Lancet Rheumatology.
LIMITATIONS:
Data on estimated glomerular filtration rate recovery in clusters with kidney disease were lacking. Populations from East Asia, where myeloperoxidase-ANCA positivity is more prevalent, were not included.
DISCLOSURES:
This study received funding from the European Union’s Horizon 2020 research and innovation program under the European Joint Programme on Rare Diseases. Some authors declared serving on advisory boards or receiving grants, contracts, travel support, consulting fees, payments, or honoraria from various pharmaceutical companies and other institutions.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.
Laughter: Comic Relief for Dry Eyes?
TOPLINE:
Laughter exercise practiced four times a day was noninferior to 0.1% sodium hyaluronic acid eye drops in alleviating the symptoms of dry eye disease and improved tear film stability and meibomian gland function, researchers found.
METHODOLOGY:
- Researchers aimed to assess the effectiveness and safety of laughter exercise in patients with symptomatic dry eye disease by conducting a two-arm clinical trial at the largest ophthalmic center in southern China.
- They included 299 patients aged 18-45 years (74% women) with symptomatic dry eye disease who were randomly assigned to receive either laughter exercise or eye drops with 0.1% sodium hyaluronic acid four times daily for 8 weeks.
- All the participants were required to have an ocular surface disease index score, a measure of symptoms related to dry eye disease, between 18 and 80 points and a fluorescein tear film breakup time of no more than 8 seconds.
- Participants in the laughter exercise group watched an instructional video and were requested to repeat the phrases “Hee hee hee, hah hah hah, cheese cheese cheese, cheek cheek cheek, hah hah hah hah hah hah” 30 times per 5-minute session.
- The primary outcome was the mean change in the ocular surface disease index score from baseline to 8 weeks.
TAKEAWAY:
- At 8 weeks, the ocular surface disease index score decreased by 10.50 points (95% CI, −13.1 to −7.82) in the laughter exercise group and by 8.83 points (95% CI, −11.7 to −6.02) in the group prescribed eye drops.
- At 12 weeks, patients in the laughter exercise group showed a significantly greater reduction in the ocular surface disease index score than those in the group prescribed eye drops (mean between-group difference, −4.08 points; P = .024).
- Laughter exercise also led to a more significant improvement in the noninvasive tear breakup time than the use of eye drops (mean between-group difference, 2.30 sec; P < .001).
- No adverse events were reported in either of the groups during the study period.
IN PRACTICE:
“As a safe, environmentally friendly, and low-cost intervention, laughter exercise could serve as a first-line, home-based treatment for people with symptomatic dry eye disease and limited corneal staining,” the authors of the study reported.
SOURCE:
This study was led by Jing Li from the State Key Laboratory of Ophthalmology at the Zhongshan Ophthalmic Center in Sun Yat-sen University in Guangzhou, China, and was published online on September 11, 2024, in The BMJ.
LIMITATIONS:
The study lacked a double-blinded design. The laughter exercise required a greater time investment than the application of eye drops, which may affect adherence in the long run.
DISCLOSURES:
This study was supported by grants from the National Natural Science Foundation of China and High-level Hospital Construction Project. The authors declared receiving support from the funding agencies.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.
A Simple Blood Test May Predict Cancer Risk in T2D
TOPLINE:
Higher baseline levels of the inflammatory marker interleukin 6 (IL-6) are associated with an increased risk for obesity-related cancers in adults recently diagnosed with type 2 diabetes (T2D), potentially enabling the identification of higher-risk individuals through a simple blood test.
METHODOLOGY:
- T2D is associated with an increased risk for obesity-related cancers, including breast, renal, uterine, thyroid, ovarian, and gastrointestinal cancers, as well as multiple myeloma, possibly because of chronic low-grade inflammation.
- Researchers explored whether the markers of inflammation IL-6, tumor necrosis factor alpha (TNF-alpha), and high-sensitivity C-reactive protein (hsCRP) can serve as predictive biomarkers for obesity-related cancers in patients recently diagnosed with T2D.
- They identified patients with recent-onset T2D and no prior history of cancer participating in the ongoing Danish Centre for Strategic Research in Type 2 Diabetes cohort study.
- At study initiation, plasma levels of IL-6 and TNF-alpha were measured using Meso Scale Discovery assays, and serum levels of hsCRP were measured using immunofluorometric assays.
TAKEAWAY:
- Among 6,466 eligible patients (40.5% women; median age, 60.9 years), 327 developed obesity-related cancers over a median follow-up of 8.8 years.
- Each SD increase in log-transformed IL-6 levels increased the risk for obesity-related cancers by 19%.
- The researchers did not find a strong association between TNF-alpha or hsCRP and obesity-related cancers.
- The addition of baseline IL-6 levels to other well-known risk factors for obesity-related cancers improved the performance of a cancer prediction model from 0.685 to 0.693, translating to a small but important increase in the ability to predict whether an individual would develop one of these cancers.
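To illustrate what an improvement in model performance such as 0.685 to 0.693 means in practice, the sketch below compares the discrimination of a risk model with and without IL-6 on simulated data. The study likely evaluated time-to-event models; this simplified version uses a binary outcome and AUC, and every variable and coefficient in it is hypothetical.

```python
# Minimal sketch on simulated data: how much does adding a biomarker (here, a
# standardized log IL-6 value) improve a risk model's discrimination? The study
# likely used time-to-event models; this simplified version uses a binary
# cancer outcome and AUC, and all values are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 6466
age = rng.normal(61, 10, n)
bmi = rng.normal(31, 5, n)
log_il6 = rng.normal(0, 1, n)
risk = -6 + 0.04 * age + 0.05 * bmi + 0.17 * log_il6
cancer = rng.binomial(1, 1 / (1 + np.exp(-risk)))

base = np.column_stack([age, bmi])
full = np.column_stack([age, bmi, log_il6])
Xb_tr, Xb_te, Xf_tr, Xf_te, y_tr, y_te = train_test_split(
    base, full, cancer, test_size=0.3, random_state=0)

auc_base = roc_auc_score(y_te, LogisticRegression(max_iter=1000).fit(Xb_tr, y_tr).predict_proba(Xb_te)[:, 1])
auc_full = roc_auc_score(y_te, LogisticRegression(max_iter=1000).fit(Xf_tr, y_tr).predict_proba(Xf_te)[:, 1])
print(f"AUC without IL-6: {auc_base:.3f}, with IL-6: {auc_full:.3f}")
```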
IN PRACTICE:
“In future, a simple blood test could identify those at higher risk of the cancers,” said the study’s lead author in an accompanying press release.
SOURCE:
The study was led by Mathilde D. Bennetsen, Steno Diabetes Center Odense, Odense University Hospital, Odense, Denmark, and published online on August 27 as an early release from the European Association for the Study of Diabetes (EASD) 2024 Annual Meeting.
LIMITATIONS:
No limitations were discussed in this abstract. However, the reliance on registry data may have introduced potential biases related to data accuracy and completeness.
DISCLOSURES:
The Danish Centre for Strategic Research in Type 2 Diabetes was supported by grants from the Danish Agency for Science and the Novo Nordisk Foundation. The authors declared no conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Alcohol’s Effect on Gout Risk Strongest in Men But Present in Both Sexes
TOPLINE:
Higher alcohol consumption is associated with an increased risk for gout, with a stronger association in men than in women. This sex-specific difference may be attributable to the types of alcohol consumed by men and women rather than to biologic variation.
METHODOLOGY:
- This prospective cohort study investigated the association between total and specific alcohol consumption and the long-term risk for incident gout in 179,828 men (mean age, 56.0 years) and 221,300 women (mean age, 56.0 years) from the UK Biobank who did not have gout at baseline.
- Alcohol consumption was assessed using a computer-assisted touch screen system. Among men, 2.9%, 3.6%, and 93.6% were identified as never, former, and current drinkers, respectively. Among women, 5.9%, 3.6%, and 90.5% were identified as never, former, and current drinkers, respectively.
- Participants were also required to share details about their weekly alcohol intake and the types of alcoholic beverages they consumed (red wine, champagne or white wine, beer or cider, spirits, or fortified wine).
- The median follow-up duration of this study was 12.7 years.
- Cases of incident gout during the follow-up period were identified using hospital records and the International Classification of Diseases codes.
TAKEAWAY:
- The risk for gout was 69% higher in men who were current drinkers than in those who were never drinkers (hazard ratio [HR], 1.69; 95% CI, 1.30-2.18), while an inverse association was observed in women who were current drinkers, although it was not statistically significant. A significant interaction was observed between drinking status and sex (P < .001 for interaction).
- Among current drinkers, more frequent alcohol consumption was associated with a higher risk for gout among both sexes, with the association being stronger in men (HR, 2.05; 95% CI, 1.84-2.30) than in women (HR, 1.34; 95% CI, 1.12-1.61).
- The consumption of beer or cider was higher in men than in women (4.2 vs 0.4 pints/wk).
- Among all alcoholic beverages, the consumption of beer or cider (per 1 pint/d) showed the strongest association with the risk for gout in both men (HR, 1.60; 95% CI, 1.53-1.67) and women (HR, 1.62; 95% CI, 1.02-2.57).
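To make the reported hazard ratios and the test for interaction more concrete, the following is a minimal sketch of a Cox proportional hazards model with a sex-by-alcohol interaction term. The variable names, simulated data, and coefficients are illustrative assumptions only; they are not the study's code or UK Biobank data.

```python
# Hypothetical sketch of a sex x alcohol interaction in a Cox model,
# in the spirit of the analysis summarized above. All data are simulated.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "male": rng.integers(0, 2, n),                       # 1 = man, 0 = woman
    "drinks_per_week": rng.poisson(6, n).astype(float),  # self-reported intake
})

# Simulate follow-up and incident gout with a stronger alcohol effect in men.
log_hazard = 0.03 * df["drinks_per_week"] + 0.08 * df["male"] * df["drinks_per_week"]
raw_time = rng.exponential(scale=12.7 / np.exp(log_hazard - log_hazard.mean()))
df["time_years"] = np.minimum(raw_time, 12.7)            # censor at median follow-up
df["incident_gout"] = (raw_time <= 12.7).astype(int)

# Explicit interaction term; its P value is the "P for interaction."
df["male_x_drinks"] = df["male"] * df["drinks_per_week"]

cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="incident_gout")
print(np.exp(cph.params_))     # hazard ratios per unit of each covariate
print(cph.summary["p"])        # includes the interaction-term P value
```

A significant coefficient on the interaction term corresponds to the kind of sex difference in the alcohol-gout association reported above.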
IN PRACTICE:
“The observed sex-specific difference in the association of total alcohol consumption with incident gout may be owing to differences between men and women in the types of alcohol consumed rather than biological differences,” the authors wrote.
SOURCE:
The study was led by Jie-Qiong Lyu, MPH, Department of Nutrition and Food Hygiene, School of Public Health, Suzhou Medical College of Soochow University in China. It was published online in JAMA Network Open.
LIMITATIONS:
The frequency of alcohol consumption was self-reported, leading to potential misclassification. Incident cases of gout were identified from hospital records, which may have caused some undiagnosed cases or those diagnosed only in primary care settings to be missed. Most participants were of European descent and relatively healthier than the general population, limiting generalizability.
DISCLOSURES:
This work was supported by the Gusu Leading Talent Plan for Scientific and Technological Innovation and Entrepreneurship. The authors declared no conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Thyroid Resistance Ups Mortality in Euthyroid CKD Patients
TOPLINE:
An impaired central sensitivity to thyroid hormone may be associated with an increased risk for death in patients with chronic kidney disease (CKD) and normal thyroid function.
METHODOLOGY:
- Previous studies have shown that abnormal levels of thyroid-stimulating hormone (TSH) are associated with a higher mortality risk in patients with CKD, but whether the risk extends to those with normal thyroid function remains controversial.
- Researchers investigated the association between central sensitivity to thyroid hormone and the risk for all-cause mortality in 1303 euthyroid patients with CKD (mean age, 60 years; 59% women) from the National Health and Nutrition Examination Survey database (2007-2012).
- All participants had CKD stages I-IV, defined as an estimated glomerular filtration rate < 60 mL/min/1.73 m² and/or a urinary albumin-to-creatinine ratio ≥ 30 mg/g.
- Central sensitivity to thyroid hormone was evaluated primarily with the Thyroid Feedback Quantile–based Index (TFQI), a newer index of central thyroid hormone resistance calculated from free thyroxine (FT4) and TSH concentrations (a computational sketch follows this list).
- The participants were followed for a median duration of 115 months, during which 503 died.
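This summary does not reproduce the TFQI formula. As commonly defined in the literature, the TFQI is the empirical quantile of FT4 minus one minus the empirical quantile of TSH, with higher values suggesting reduced central sensitivity; the sketch below illustrates that assumed definition with made-up values and is not the study's code.

```python
# Hedged sketch of one common TFQI definition:
#   TFQI = cdf(FT4) - (1 - cdf(TSH))
# where cdf is the empirical cumulative distribution in a reference sample.
# Values and implementation details are illustrative assumptions.
import numpy as np
from scipy.stats import rankdata

def empirical_cdf(values: np.ndarray) -> np.ndarray:
    """Quantile rank (empirical CDF) of each value within the sample."""
    return rankdata(values, method="average") / len(values)

def tfqi(free_t4: np.ndarray, tsh: np.ndarray) -> np.ndarray:
    """Higher TFQI suggests reduced central (pituitary) sensitivity."""
    return empirical_cdf(free_t4) - (1.0 - empirical_cdf(tsh))

ft4 = np.array([12.0, 15.5, 18.2, 14.1, 16.8])  # pmol/L, made-up values
tsh = np.array([1.2, 2.8, 3.5, 0.9, 2.1])       # mIU/L, made-up values
print(tfqi(ft4, tsh))                            # values near 0 = preserved feedback
```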
TAKEAWAY:
- Patients with CKD who died during the follow-up period had a significantly higher TFQI (P < .001) than those who survived.
- The rates of all-cause mortality increased from 26.61% in the lowest TFQI tertile to 40.89% in the highest tertile (P = .001).
- A per unit increase in the TFQI was associated with a 40% increased risk for all-cause mortality (hazard ratio, 1.40; 95% CI, 1.10-1.79).
- This association between TFQI level and all-cause mortality persisted in all subgroups stratified by age, gender, race, body mass index, hypertension, diabetes, cardiovascular diseases, and CKD stages.
IN PRACTICE:
“Our study demonstrates that impaired sensitivity to thyroid hormone might be associated with all-cause mortality in CKD patients with normal thyroid function, independent of other traditional risk factors and comorbidities,” the authors wrote.
SOURCE:
This study was led by Qichao Yang and Ru Dong, Department of Endocrinology, Affiliated Wujin Hospital of Jiangsu University, Changzhou, China, and was published online on August 6, 2024, in BMC Public Health.
LIMITATIONS:
Thyroid function was measured only at baseline, so changes in thyroid function over time were not captured. The study excluded people on thyroid hormone replacement therapy but did not account for other medications that might affect thyroid function, such as beta-blockers, steroids, and amiodarone. Thyroid-related antibodies, metabolic syndrome, and nonalcoholic fatty liver disease were not included in the analysis as possible confounding factors. The findings, based on a US sample, require further validation.
DISCLOSURES:
The study was supported by the Changzhou Health Commission. The authors declared no competing interests.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Analysis of Colchicine’s Drug-Drug Interactions Finds Little Risk
TOPLINE:
The presence of an operational classification of drug interactions (ORCA) class 3 or 4 drug-drug interaction (DDI) did not increase the risk for colchicine-related gastrointestinal adverse events or modify the effect of colchicine on death or hospitalization due to COVID-19 in ambulatory patients.
METHODOLOGY:
- This secondary analysis of the COLCORONA trial aimed to evaluate if a potential DDI of colchicine was associated with changes in its pharmacokinetics or modified its clinical safety and efficacy in patients with COVID-19.
- Overall, 4432 ambulatory patients with COVID-19 (median age, 54 years; 54% women) were randomly assigned to receive colchicine 0.5 mg twice daily for 3 days and then 0.5 mg once daily for 27 days (n = 2205) or a placebo (n = 2227).
- All the participants had at least one high-risk criterion such as age ≥ 70 years, diabetes, heart failure, systolic blood pressure ≥ 150 mm Hg, respiratory disease, coronary disease, body temperature ≥ 38.4 °C within the last 48 hours, dyspnea, bicytopenia, pancytopenia, or high neutrophil count with low lymphocyte count.
- The medications that could interact with colchicine were determined and categorized under ORCA classes 1 (contraindicated), 2 (provisionally contraindicated), 3 (conditional use), or 4 (minimal risk).
- The primary outcome was any gastrointestinal adverse event assessed over a 30-day follow-up period.
TAKEAWAY:
- Among all the participants, 1% received medications with an ORCA class 2 interaction, 14% with a class 3 interaction, and 13% with a class 4 interaction; rosuvastatin (12%) and atorvastatin (10%) were the most common interacting medications.
- The odds of any gastrointestinal adverse event were 1.80 times as high in the colchicine arm as in the placebo arm among patients without a DDI and 1.68 times as high among those with a DDI; the effect of colchicine was consistent regardless of the presence of drug interactions (P = .69 for interaction).
- Similarly, DDIs did not influence the effect of colchicine on combined risk for COVID-19 hospitalization or mortality (P = .80 for interaction).
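For readers unfamiliar with a "P for interaction," the sketch below shows the type of logistic model such a test can come from, with a treatment-by-DDI interaction term. The data, variable names, and coefficients are simulated placeholders, not trial data or the authors' analysis code.

```python
# Illustrative logistic regression with a colchicine x DDI interaction term.
# Simulated data only; not the COLCORONA analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 4432
df = pd.DataFrame({
    "colchicine": rng.integers(0, 2, n),  # 1 = colchicine arm, 0 = placebo
    "ddi": rng.integers(0, 2, n),         # 1 = interacting co-medication present
})

# Simulate GI events with a colchicine effect that barely depends on DDI status.
logit_p = -1.5 + 0.59 * df["colchicine"] + 0.05 * df["ddi"] - 0.07 * df["colchicine"] * df["ddi"]
df["gi_event"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("gi_event ~ colchicine * ddi", data=df).fit(disp=False)
print(np.exp(model.params))              # odds ratios; 'colchicine' = OR without a DDI
print(model.pvalues["colchicine:ddi"])   # P for interaction
```

A nonsignificant interaction term, as reported here, means the colchicine effect is statistically similar in patients with and without an interacting drug.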
IN PRACTICE:
“Once potential DDIs have been identified through screening, they must be tested,” Hemalkumar B. Mehta, PhD, and G. Caleb Alexander, MD, of the Johns Hopkins Bloomberg School of Public Health, Baltimore, wrote in an invited commentary published online in JAMA Network Open. “Theoretical DDIs may not translate into real-world harms,” they added.
SOURCE:
The study was led by Lama S. Alfehaid, PharmD, of Brigham and Women’s Hospital, Boston. It was published online in JAMA Network Open.
LIMITATIONS:
This study focused on the medications used by participants at baseline, which may not have captured all potential DDIs. The findings did not provide information on rare adverse events, such as rhabdomyolysis, which usually occur months after initiating drug therapy. Furthermore, all the study participants had confirmed SARS-CoV-2 infection, which may have increased their susceptibility to adverse reactions associated with the use of colchicine.
DISCLOSURES:
Some authors were supported by grants from the National Institutes of Health/National Heart, Lung, and Blood Institute, American Heart Association, and other sources. The authors also declared serving on advisory boards or on the board of directors; receiving personal fees, grants, research support, or speaking fees; or having other ties with many pharmaceutical companies.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Testosterone Increases Metabolic Syndrome Risk in Trans Men
TOPLINE:
Long-term gender-affirming hormone treatment with testosterone increases the risk for metabolic syndrome in transmasculine individuals, whereas transfeminine individuals receiving estradiol have a lower risk.
METHODOLOGY:
- Many transgender individuals receive exogenous sex hormone therapy to reduce gender dysphoria and improve quality of life. These treatments, however, may influence the development of metabolic syndrome.
- This retrospective, longitudinal cohort study investigated the association between gender-affirming hormone treatment and metabolic syndrome scores in transfeminine and transmasculine individuals compared with cisgender men and women not receiving the treatment.
- Overall, 645 transgender participants (mean age at index date, 41.3 years; 494 transfeminine and 151 transmasculine) were matched with 645 cisgender participants (280 women and 365 men) from the Veterans Health Administration.
- Metabolic syndrome scores were calculated based on blood pressure; body mass index (BMI); and levels of high-density lipoprotein (HDL) cholesterol, triglycerides, and blood glucose (one possible scoring approach is sketched after this list).
- Changes in metabolic syndrome scores before and after hormonal transition were compared among transgender and cisgender individuals for the corresponding dates.
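The summary lists the score's components but not the exact scoring rule. One common approach, shown below purely as an assumed illustration, sums standardized (z-scored) component values with HDL cholesterol reversed because higher HDL is protective; this is not necessarily the scoring used by the study.

```python
# Hedged sketch of a continuous metabolic syndrome score as a sum of z-scores.
# Column names, units, and the toy cohort are illustrative assumptions.
import pandas as pd

def metabolic_syndrome_score(components: pd.DataFrame) -> pd.Series:
    """Expected columns: sbp (mm Hg), bmi (kg/m2), hdl (mg/dL),
    triglycerides (mg/dL), glucose (mg/dL)."""
    z = (components - components.mean()) / components.std(ddof=0)
    z["hdl"] = -z["hdl"]          # lower HDL contributes adversely
    return z.sum(axis=1)          # higher score = worse metabolic profile

cohort = pd.DataFrame({
    "sbp": [118, 135, 142],
    "bmi": [24.5, 29.1, 33.0],
    "hdl": [58, 44, 38],
    "triglycerides": [95, 160, 210],
    "glucose": [88, 104, 121],
})
print(metabolic_syndrome_score(cohort))
```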
TAKEAWAY:
- After hormonal transition, all measured metabolic syndrome components significantly worsened in the transmasculine group (P < .05 for all).
- In contrast, the systolic blood pressure and triglyceride levels decreased, HDL cholesterol levels increased, and BMI showed no significant change in the transfeminine group after hormonal transition.
- The increase in metabolic syndrome scores after vs before the date of hormonal transition was the highest for transmasculine individuals (298.0%; P < .001), followed by cisgender women (108.3%; P < .001), cisgender men (49.3%; P = .02), and transfeminine individuals (3.0%; P = .77).
IN PRACTICE:
“This is relevant for the management of metabolic syndrome risk factors in cisgender and transgender individuals and to potentially predict the risk of atherosclerotic cardiovascular disease, type 2 diabetes, systolic hypertension, insulin resistance, and nonalcoholic fatty liver disease,” the authors wrote.
SOURCE:
Leila Hashemi, MD, MS, of the Department of General Internal Medicine, David Geffen School of Medicine, University of California, Los Angeles, led this study, which was published online in JAMA Network Open.
LIMITATIONS:
Causal inferences could not be drawn because of the study’s observational nature. The transmasculine and cisgender female groups were limited in size, and military veterans have special circumstances not representative of the general population. Minority stress among the transgender veterans was also not considered, which may have affected the health and well-being outcomes.
DISCLOSURES:
This study was supported by the National Institutes of Health and Office of Research on Women’s Health grants. One author received grants from the National Institutes of Health.
A version of this article first appeared on Medscape.com.