Central Line Skin Reactions in Children: Survey Addresses Treatment Protocols in Use
TOPLINE:
A survey of dermatologists providing pediatric inpatient care found that most had no standardized approach for managing central line dressing (CLD)-associated skin reactions in children and reported varying management approaches.
METHODOLOGY:
- Researchers developed and administered a 14-item Qualtrics survey to 107 dermatologists providing pediatric inpatient care through the Society for Pediatric Dermatology’s Inpatient Dermatology Section and Section Chief email lists.
- A total of 35 dermatologists (33%) from multiple institutions responded to the survey; most respondents (94%) specialized in pediatric dermatology.
- Researchers assessed management of CLD-associated adverse skin reactions.
TAKEAWAY:
- All respondents reported receiving CLD-related consults, but 66% indicated there was no personal or institutional standardized approach for managing CLD-associated skin reactions.
- Among the 25 respondents who answered the question, most said reactions occurred primarily in children aged 1-12 years (19 of 25; 76%) rather than in infants younger than 1 year (3 of 25; 12%).
- Management strategies included switching to alternative products, applying topical corticosteroids, and performing patch testing for allergies.
IN PRACTICE:
“Insights derived from this study, including variation in clinician familiarity with reaction patterns, underscore the necessity of a standardized protocol for classifying and managing cutaneous CLD reactions in pediatric patients,” the authors wrote. “Further investigation is needed to better characterize CLD-associated allergic CD [contact dermatitis], irritant CD, and skin infections, as well as at-risk populations, to better inform clinical approaches,” they added.
SOURCE:
The study was led by Carly Mulinda, Columbia University College of Physicians and Surgeons, New York, and was published online on December 16 in Pediatric Dermatology.
LIMITATIONS:
The authors noted variable respondent awareness of institutional CLD practices and potential recency bias as key limitations of the study.
DISCLOSURES:
Study funding source was not declared. The authors reported no conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.
AI-Aided Colonoscopy’s ‘Intelligent’ Module Ups Polyp Detection
Colin J. Rees, a professor of gastroenterology in the Faculty of Medical Sciences at Newcastle University in Newcastle upon Tyne, England, and colleagues compared the real-world clinical effectiveness of computer-aided detection (CADe)–assisted colonoscopy using an “intelligent” module with that of standard colonoscopy in a study in The Lancet Gastroenterology & Hepatology.
They found the GI Genius Intelligent Endoscopy Module (Medtronic) increased the mean number of adenomas detected per procedure and the adenoma detection rate, especially for small, flat (type 0-IIa) polyps, and sessile serrated lesions, which are more likely to be missed.
“Missed sessile serrated lesions disproportionately increase the risk of post-colonoscopy colorectal cancer, thus the adoption of GI Genius into routine colonoscopy practice could not only increase polyp detection but also reduce the incidence of post-colonoscopy colorectal cancer,” the investigators wrote.
“AI is going to have a major impact upon most aspects of healthcare. Some areas of medical practice are now well established, and some are still in evolution,” Rees, who is also president of the British Society of Gastroenterology, said in an interview. “Within gastroenterology, the role of AI in endoscopic diagnostics is also evolving. The COLO-DETECT trial demonstrates that AI increases detection of lesions, and work is ongoing to see how AI might help with characterization and other elements of endoscopic practice.”
Study Details
The multicenter, open-label, parallel-arm, pragmatic randomized controlled trial was conducted at 12 National Health Service hospitals in England. The study cohort consisted of adults (≥ 18 years) undergoing colonoscopy for colorectal cancer (CRC) screening, for surveillance owing to personal or family history, or for gastrointestinal symptoms.
Recruiting staff, participants, and colonoscopists were unmasked to allocation, whereas histopathologists, cochief investigators, and trial statisticians were masked.
CADe-assisted colonoscopy consisted of standard colonoscopy plus the GI Genius module active for at least the entire inspection phase of colonoscope withdrawal.
The primary outcome was mean adenomas per procedure (total number of adenomas detected divided by total number of procedures). The key secondary outcome was adenoma detection rate (proportion of colonoscopies with at least one adenoma).
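For readers who want the arithmetic behind these two endpoints spelled out, the short sketch below computes both measures from a list of per-procedure adenoma counts; the counts are hypothetical and purely illustrative, not trial data.

```python
# Illustrative calculation of the two COLO-DETECT endpoints from per-procedure counts.
# The counts below are hypothetical examples, not data from the trial.
from typing import Sequence


def mean_adenomas_per_procedure(counts: Sequence[int]) -> float:
    """Primary outcome: total adenomas detected divided by total procedures."""
    return sum(counts) / len(counts)


def adenoma_detection_rate(counts: Sequence[int]) -> float:
    """Key secondary outcome: proportion of colonoscopies with at least one adenoma."""
    return sum(1 for c in counts if c >= 1) / len(counts)


if __name__ == "__main__":
    counts = [0, 1, 3, 0, 2]  # adenomas found in five hypothetical colonoscopies
    print(f"Mean adenomas per procedure: {mean_adenomas_per_procedure(counts):.2f}")  # 1.20
    print(f"Adenoma detection rate: {adenoma_detection_rate(counts):.0%}")            # 60%
```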
From March 2021 to April 2023, the investigators recruited 2032 participants (55.7% men; mean age, 62.4 years) and randomly assigned them to CADe-assisted colonoscopy (n = 1015) or standard colonoscopy (n = 1017). Of these, 60.6% were undergoing screening and 39.4% had symptomatic indications.
Mean adenomas per procedure were 1.56 (SD, 2.82; n = 1001 participants with data) in the CADe-assisted group vs 1.21 (n = 1009) in the standard group, for an adjusted mean difference of 0.36 (95% CI, 0.14-0.57; adjusted incidence rate ratio, 1.30; 95% CI, 1.15-1.47; P < .0001).
Adenomas were detected in 555 (56.6%) of 980 participants in the CADe-assisted group vs 477 (48.4%) of 986 in the standard group, representing a proportion difference of 8.3% (95% CI, 3.9-12.7; adjusted odds ratio, 1.47; 95% CI, 1.21-1.78; P < .0001).
As to safety, adverse events were numerically comparable between the intervention and control groups (25 vs 19 overall events; 4 vs 6 serious events). On independent review, no adverse events in the CADe-assisted colonoscopy group were related to GI Genius.
Offering a US perspective on the study, Nabil M. Mansour, MD, an associate professor and director of the McNair General GI Clinic at Baylor College of Medicine in Houston, Texas, said GI Genius and other CADe systems represent a significant advance over standard colonoscopy for identifying premalignant polyps. “While the data have been mixed, most studies, particularly randomized controlled trials, have shown significant improvements with CADe in detection, both in terms of adenomas per colonoscopy and reductions in adenoma miss rate,” he said in an interview.
He added that the main utility of CADe is for asymptomatic patients undergoing average-risk screening and surveillance colonoscopy for CRC prevention, as well as for those with positive stool-based screening tests, “though there is no downside to using it in symptomatic patients as well.” Although AI-assisted colonoscopy is likely still used in fewer than 50% of endoscopy centers overall, mainly academic centers, his clinic has been using it for the past year.
The main question, Mansour cautioned, is whether increased detection of small polyps will actually reduce CRC incidence or mortality, and it will likely be several years before clear, concrete data can answer that.
“Most studies have shown the improvement in adenoma detection is mainly for diminutive polyps < 5 mm in diameter, but whether that will actually translate to substantive improvements in hard outcomes is as yet unknown,” he said. “But if gastroenterologists are interested in doing everything they can today to help improve detection rates and lower miss rates of premalignant polyps, serious consideration should be given to adopting the use of CADe in practice.”
This study was supported by Medtronic. Rees reported receiving grant funding from ARC Medical, Norgine, Medtronic, 3-D Matrix, and Olympus Medical, and has been an expert witness for ARC Medical. Other authors disclosed receiving research funding, honoraria, or travel expenses from Medtronic or other private companies. Mansour had no competing interests to declare.
A version of this article appeared on Medscape.com.
FROM THE LANCET GASTROENTEROLOGY & HEPATOLOGY
Early Postpartum IUD Doesn’t Spike Healthcare Utilization
TOPLINE:
Healthcare utilization after immediate and delayed intrauterine device (IUD) placement postpartum was comparable, with the immediate placement group making slightly fewer visits to obstetricians or gynecologists (ob/gyns). While immediate placement was associated with increased rates of imaging, it showed lower rates of laparoscopic surgery for IUD-related complications.
METHODOLOGY:
- Researchers conducted a retrospective cohort study using data from Kaiser Permanente Northern California electronic health records to compare healthcare utilization after immediate (within 24 hours of placental delivery) and delayed (after 24 hours up to 6 weeks later) IUD placement.
- They included 11,875 patients who delivered a live neonate and had an IUD placed between 0 and 63 days postpartum from 2016 to 2020, of whom 1543 received immediate IUD placement.
- The primary outcome measures focused on the number of outpatient visits to ob/gyns for any indication within 1 year after delivery.
- The secondary outcomes included pelvic or abdominal ultrasonograms performed in radiology departments, surgical interventions, hospitalizations related to IUD placement, and rates of pregnancy within 1 year.
TAKEAWAY:
- Immediate placement of an IUD was associated with a modest decrease in the number of overall visits to ob/gyns compared with delayed placement (mean visits, 2.30 vs 2.47; adjusted risk ratio [aRR], 0.91; 95% CI, 0.87-0.94; P < .001).
- Immediate placement of an IUD was associated with more imaging studies performed outside of an ob/gyn visit (aRR, 2.26; P < .001); however, rates of laparoscopic surgery for IUD-related complications were lower in the immediate group than in the delayed group (0.0% vs 0.4%; P = .005).
- Hospitalizations related to IUD insertion were rare but more frequent in the immediate group (0.4% vs 0.02% in the delayed group; P < .001).
- No significant differences in repeat pregnancies were observed between the groups at 1 year (P = .342), and immediate placement of an IUD was not associated with an increased risk for ectopic pregnancies.
IN PRACTICE:
“Because one of the main goals of immediate IUD is preventing short-interval unintended pregnancies, it is of critical importance to highlight that there was no difference in the pregnancy rate between groups in the study,” the authors wrote. “This study can guide patient counseling and consent for immediate IUD,” they added.
SOURCE:
This study was led by Talis M. Swisher, MD, of the Department of Obstetrics and Gynecology at the San Leandro Medical Center of Kaiser Permanente in San Leandro, California. It was published online on December 12, 2024, in Obstetrics & Gynecology.
LIMITATIONS:
Data on patient satisfaction were not included in this study. No analysis of cost-benefit was carried out due to challenges in comparing differences in insurance plans and regional disparities in costs across the United States. The study setting was unique to Kaiser Permanente Northern California, in which all patients in the hospital had access to IUDs and multiple settings of ultrasonography were readily available. Visits carried out virtually were not included in the analysis.
DISCLOSURES:
This study was supported by the Kaiser Permanente Northern California Graduate Medical Education Program, Kaiser Foundation Hospitals. The authors reported no potential conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Brain Changes in Youth Who Use Substances: Cause or Effect?
A widely accepted assumption in the addiction field is that neuroanatomical changes observed in young people who use alcohol or other substances are largely the consequence of exposure to these substances.
But a new study suggests that neuroanatomical features in children, including greater whole brain and cortical volumes, are evident before exposure to any substances.
The investigators, led by Alex P. Miller, PhD, assistant professor in the Department of Psychiatry at Indiana University, Indianapolis, noted that the findings add to a growing body of work suggesting that some of these neuroanatomical differences precede substance use and may reflect predispositional risk rather than resulting solely from substance exposure.
The findings were published online in JAMA Network Open.
Neuroanatomy a Predisposing Risk Factor?
Earlier research showed that substance use is associated with lower gray matter volume, thinner cortex, and less white matter integrity. While it has been widely thought that these changes were induced by the use of alcohol or illicit drugs, recent longitudinal and genetic studies suggest that the neuroanatomical changes may also be predisposing risk factors for substance use.
To better understand the issue, investigators analyzed data on 9804 children (mean baseline age, 9.9 years; 53% male; 76% White) at 22 US sites enrolled in the Adolescent Brain Cognitive Development (ABCD) Study, which is examining brain and behavioral development from middle childhood to young adulthood.
The researchers collected information on the use of alcohol, nicotine, cannabis, and other illicit substances from in-person interviews at baseline and years 1, 2, and 3, as well as interim phone interviews at 6, 18, and 30 months. MRI scans provided extensive brain structural data, including global and regional cortical volume, thickness, surface area, sulcal depth, and subcortical volume.
Of the total, 3460 participants (35%) initiated substance use before age 15, with 90% reporting alcohol use initiation. There was considerable overlap between initiation of alcohol, nicotine, and cannabis.
The researchers tested whether baseline neuroanatomical variability was associated with any substance use initiation before or up to 3 years following initial neuroimaging scans. Study covariates included baseline age, sex, pubertal status, familial relationship (eg, sibling or twin), and prenatal substance exposures. Researchers didn’t control for sociodemographic characteristics as these could influence associations.
Significant Brain Differences
Compared with no substance use initiation, any substance use initiation was associated with larger global neuroanatomical indices, including whole brain (beta = 0.05; P = 2.80 × 10−8), total intracranial (beta = 0.04; P = 3.49 × 10−6), cortical (beta = 0.05; P = 4.31 × 10−8), and subcortical volumes (beta = 0.05; P = 4.39 × 10−8), as well as greater total cortical surface area (beta = 0.04; P = 6.05 × 10−7).
The direction of associations between cortical thickness and substance use initiation was regionally specific; any substance use initiation was characterized by thinner cortex in all frontal regions (eg, rostral middle frontal gyrus, beta = −0.03; P = 6.99 × 10−6), but thicker cortex in all other lobes. It was also associated with larger regional brain volumes, deeper regional sulci, and differences in regional cortical surface area.
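To make the reported coefficients concrete, the sketch below shows the general kind of association test being described: a regression of a binary substance-use-initiation indicator on a standardized brain measure with a few covariates. This is not the authors' analysis code; the ABCD analyses used mixed-effects models with family and site nesting and additional covariates, and the data, column names, and model below are simulated, simplified, and illustrative only.

```python
# Hypothetical, simplified illustration of regressing substance use initiation
# on a standardized (z-scored) brain measure with covariates. Not the ABCD model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "initiated": rng.integers(0, 2, n),            # 1 = any substance use initiation (simulated)
    "whole_brain_vol": rng.normal(1.2e6, 1e5, n),  # simulated whole brain volumes, mm^3
    "age": rng.normal(9.9, 0.6, n),                # simulated baseline age, years
    "male": rng.integers(0, 2, n),                 # simulated sex indicator
})

# Standardize the brain measure so its coefficient is on a "beta"-like scale.
df["brain_z"] = (df["whole_brain_vol"] - df["whole_brain_vol"].mean()) / df["whole_brain_vol"].std()

model = smf.logit("initiated ~ brain_z + age + male", data=df).fit(disp=False)
print(model.params["brain_z"], model.pvalues["brain_z"])  # coefficient and P value for the brain term
```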
The authors noted total cortical thickness peaks at age 1.7 years and steadily declines throughout life. By contrast, subcortical volumes peak at 14.4 years of age and generally remain stable before steep later life declines.
Secondary analyses compared initiation of the three most commonly used substances in early adolescence (alcohol, nicotine, and cannabis) with no substance use.
Findings for alcohol largely mirrored those for any substance use. However, the study uncovered additional significant associations, including greater left lateral occipital volume and bilateral para-hippocampal gyri cortical thickness and less bilateral superior frontal gyri cortical thickness.
Nicotine use was associated with lower right superior frontal gyrus volume and deeper left lateral orbitofrontal cortex sulci. And cannabis use was associated with thinner left precentral gyrus and lower right inferior parietal gyrus and right caudate volumes.
The authors noted results for nicotine and cannabis may not have had adequate statistical power, and small effects suggest these findings aren’t clinically informative for individuals. However, they wrote, “They do inform and challenge current theoretical models of addiction.”
Associations Precede Substance Use
A post hoc analysis further challenges current models of addiction. When researchers looked only at the 1203 youth who initiated substance use after the baseline neuroimaging session, they found most associations preceded substance use.
“That regional associations may precede substance use initiation, including less cortical thickness in the right rostral middle frontal gyrus, challenges predominant interpretations that these associations arise largely due to neurotoxic consequences of exposure and increases the plausibility that these features may, at least partially, reflect markers of predispositional risk,” wrote the authors.
A study limitation was that unmeasured confounders and undetected systemic differences in missing data may have influenced associations. Sociodemographic, environmental, and genetic variables that were not included as covariates are likely associated with both neuroanatomical variability and substance use initiation and may moderate associations between them, said the authors.
The ABCD Study provides “a robust and large database of longitudinal data” that goes beyond previous neuroimaging research “to understand the bidirectional relationship between brain structure and substance use,” Miller said in a press release.
“The hope is that these types of studies, in conjunction with other data on environmental exposures and genetic risk, could help change how we think about the development of substance use disorders and inform more accurate models of addiction moving forward,” Miller said.
Reevaluating Causal Assumptions
In an accompanying editorial, Felix Pichardo, MA, and Sylia Wilson, PhD, from the Institute of Child Development, University of Minnesota, Minneapolis, suggested that it may be time to “reevaluate the causal assumptions that underlie brain disease models of addiction” and the mechanisms by which it develops, persists, and becomes harmful.
Neurotoxic effects of substances are central to current brain disease models of addiction, wrote Pichardo and Wilson. “Substance exposure is thought to affect cortical and subcortical regions that support interrelated systems, resulting in desensitization of reward-related processing, increased stress that prompts cravings, negative emotions when cravings are unsated, and weakening of cognitive control abilities that leads to repeated returns to use.”
The editorial writers praised the ABCD Study's large sample size, which provides the precision, statistical accuracy, and ability to detect both larger and smaller effects that are critical for addiction research.
Unlike most addiction research that relies on cross-sectional designs, the current study used longitudinal assessments, which is another of its strengths, they noted.
“Longitudinal study designs like in the ABCD Study are fundamental for establishing temporal ordering across constructs, which is important because establishing temporal precedence is a key step in determining causal links and underlying mechanisms.”
The inclusion of several genetically informative components, such as the family study design, nested twin subsamples, and DNA collection, “allows researchers to extend beyond temporal precedence toward increased causal inference and identification of mechanisms,” they added.
The study received support from the National Institutes of Health. The study authors and editorial writers had no relevant conflicts of interest.
A version of this article appeared on Medscape.com.
A widely accepted assumption in the addiction field is that neuroanatomical changes observed in young people who use alcohol or other substances are largely the consequence of exposure to these substances.
But a new study suggests that neuroanatomical features in children, including greater whole brain and cortical volumes, are evident before exposure to any substances.
The investigators, led by Alex P. Miller, PhD, assistant professor, Department of Psychiatry, Indiana University, Indianapolis, noted that the findings add to a growing body of work that suggests
The findings were published online in JAMA Network Open.
Neuroanatomy a Predisposing Risk Factor?
Earlier research showed that substance use is associated with lower gray matter volume, thinner cortex, and less white matter integrity. While it has been widely thought that these changes were induced by the use of alcohol or illicit drugs, recent longitudinal and genetic studies suggest that the neuroanatomical changes may also be predisposing risk factors for substance use.
To better understand the issue, investigators analyzed data on 9804 children (mean baseline age, 9.9 years; 53% men; 76% White) at 22 US sites enrolled in the Adolescent Brain Cognitive Development (ABCD) Study that’s examining brain and behavioral development from middle childhood to young adulthood.
The researchers collected information on the use of alcohol, nicotine, cannabis, and other illicit substances from in-person interviews at baseline and years 1, 2, and 3, as well as interim phone interviews at 6, 18, and 30 months. MRI scans provided extensive brain structural data, including global and regional cortical volume, thickness, surface area, sulcal depth, and subcortical volume.
Of the total, 3460 participants (35%) initiated substance use before age 15, with 90% reporting alcohol use initiation. There was considerable overlap between initiation of alcohol, nicotine, and cannabis.
The researchers tested whether baseline neuroanatomical variability was associated with any substance use initiation before or up to 3 years following initial neuroimaging scans. Study covariates included baseline age, sex, pubertal status, familial relationship (eg, sibling or twin), and prenatal substance exposures. Researchers didn’t control for sociodemographic characteristics as these could influence associations.
Significant Brain Differences
Compared with no substance use initiation, any substance use initiation was associated with larger global neuroanatomical indices, including whole brain (beta = 0.05; P = 2.80 × 10–8), total intracranial (beta = 0.04; P = 3.49 × 10−6), cortical (beta = 0.05; P = 4.31 × 10–8), and subcortical volumes (beta = 0.05; P = 4.39 × 10–8), as well as greater total cortical surface area (beta = 0.04; P = 6.05 × 10–7).
The direction of associations between cortical thickness and substance use initiation was regionally specific; any substance use initiation was characterized by thinner cortex in all frontal regions (eg, rostral middle frontal gyrus, beta = −0.03; P = 6.99 × 10–6), but thicker cortex in all other lobes. It was also associated with larger regional brain volumes, deeper regional sulci, and differences in regional cortical surface area.
The authors noted total cortical thickness peaks at age 1.7 years and steadily declines throughout life. By contrast, subcortical volumes peak at 14.4 years of age and generally remain stable before steep later life declines.
Secondary analyses compared initiation of the three most commonly used substances in early adolescence (alcohol, nicotine, and cannabis) with no substance use.
Findings for alcohol largely mirrored those for any substance use. However, the study uncovered additional significant associations, including greater left lateral occipital volume and bilateral para-hippocampal gyri cortical thickness and less bilateral superior frontal gyri cortical thickness.
Nicotine use was associated with lower right superior frontal gyrus volume and deeper left lateral orbitofrontal cortex sulci. And cannabis use was associated with thinner left precentral gyrus and lower right inferior parietal gyrus and right caudate volumes.
The authors noted results for nicotine and cannabis may not have had adequate statistical power, and small effects suggest these findings aren’t clinically informative for individuals. However, they wrote, “They do inform and challenge current theoretical models of addiction.”
Associations Precede Substance Use
A post hoc analysis further challenges current models of addiction. When researchers looked only at the 1203 youth who initiated substance use after the baseline neuroimaging session, they found most associations preceded substance use.
“That regional associations may precede substance use initiation, including less cortical thickness in the right rostral middle frontal gyrus, challenges predominant interpretations that these associations arise largely due to neurotoxic consequences of exposure and increases the plausibility that these features may, at least partially, reflect markers of predispositional risk,” wrote the authors.
A study limitation was that unmeasured confounders and undetected systemic differences in missing data may have influenced associations. Sociodemographic, environmental, and genetic variables that were not included as covariates are likely associated with both neuroanatomical variability and substance use initiation and may moderate associations between them, said the authors.
The ABCD Study provides “a robust and large database of longitudinal data” that goes beyond previous neuroimaging research “to understand the bidirectional relationship between brain structure and substance use,” Miller said in a press release.
“The hope is that these types of studies, in conjunction with other data on environmental exposures and genetic risk, could help change how we think about the development of substance use disorders and inform more accurate models of addiction moving forward,” Miller said.
Reevaluating Causal Assumptions
In an accompanying editorial, Felix Pichardo, MA, and Sylia Wilson, PhD, from the Institute of Child Development, University of Minnesota, Minneapolis, suggested that it may be time to “reevaluate the causal assumptions that underlie brain disease models of addiction” and the mechanisms by which it develops, persists, and becomes harmful.
Neurotoxic effects of substances are central to current brain disease models of addiction, wrote Pichardo and Wilson. “Substance exposure is thought to affect cortical and subcortical regions that support interrelated systems, resulting in desensitization of reward-related processing, increased stress that prompts cravings, negative emotions when cravings are unsated, and weakening of cognitive control abilities that leads to repeated returns to use.”
The editorial writers praised the ABCD Study for its large sample size for providing a level of precision, statistical accuracy, and ability to identify both larger and smaller effects, which are critical for addiction research.
Unlike most addiction research that relies on cross-sectional designs, the current study used longitudinal assessments, which is another of its strengths, they noted.
“Longitudinal study designs like in the ABCD Study are fundamental for establishing temporal ordering across constructs, which is important because establishing temporal precedence is a key step in determining causal links and underlying mechanisms.”
The inclusion of several genetically informative components, such as the family study design, nested twin subsamples, and DNA collection, “allows researchers to extend beyond temporal precedence toward increased causal inference and identification of mechanisms,” they added.
The study received support from the National Institutes of Health. The study authors and editorial writers had no relevant conflicts of interest.
A version of this article appeared on Medscape.com.
A widely accepted assumption in the addiction field is that neuroanatomical changes observed in young people who use alcohol or other substances are largely the consequence of exposure to these substances.
But a new study suggests that neuroanatomical features in children, including greater whole brain and cortical volumes, are evident before exposure to any substances.
The investigators, led by Alex P. Miller, PhD, assistant professor, Department of Psychiatry, Indiana University, Indianapolis, noted that the findings add to a growing body of work that suggests
The findings were published online in JAMA Network Open.
Neuroanatomy a Predisposing Risk Factor?
Earlier research showed that substance use is associated with lower gray matter volume, thinner cortex, and less white matter integrity. While it has been widely thought that these changes were induced by the use of alcohol or illicit drugs, recent longitudinal and genetic studies suggest that the neuroanatomical changes may also be predisposing risk factors for substance use.
To better understand the issue, investigators analyzed data on 9804 children (mean baseline age, 9.9 years; 53% men; 76% White) at 22 US sites enrolled in the Adolescent Brain Cognitive Development (ABCD) Study that’s examining brain and behavioral development from middle childhood to young adulthood.
The researchers collected information on the use of alcohol, nicotine, cannabis, and other illicit substances from in-person interviews at baseline and years 1, 2, and 3, as well as interim phone interviews at 6, 18, and 30 months. MRI scans provided extensive brain structural data, including global and regional cortical volume, thickness, surface area, sulcal depth, and subcortical volume.
Of the total, 3460 participants (35%) initiated substance use before age 15, with 90% reporting alcohol use initiation. There was considerable overlap between initiation of alcohol, nicotine, and cannabis.
The researchers tested whether baseline neuroanatomical variability was associated with any substance use initiation before or up to 3 years following initial neuroimaging scans. Study covariates included baseline age, sex, pubertal status, familial relationship (eg, sibling or twin), and prenatal substance exposures. Researchers didn’t control for sociodemographic characteristics as these could influence associations.
Significant Brain Differences
Compared with no substance use initiation, any substance use initiation was associated with larger global neuroanatomical indices, including whole brain (beta = 0.05; P = 2.80 × 10–8), total intracranial (beta = 0.04; P = 3.49 × 10−6), cortical (beta = 0.05; P = 4.31 × 10–8), and subcortical volumes (beta = 0.05; P = 4.39 × 10–8), as well as greater total cortical surface area (beta = 0.04; P = 6.05 × 10–7).
The direction of associations between cortical thickness and substance use initiation was regionally specific; any substance use initiation was characterized by thinner cortex in all frontal regions (eg, rostral middle frontal gyrus, beta = −0.03; P = 6.99 × 10–6), but thicker cortex in all other lobes. It was also associated with larger regional brain volumes, deeper regional sulci, and differences in regional cortical surface area.
The authors noted total cortical thickness peaks at age 1.7 years and steadily declines throughout life. By contrast, subcortical volumes peak at 14.4 years of age and generally remain stable before steep later life declines.
Secondary analyses compared initiation of the three most commonly used substances in early adolescence (alcohol, nicotine, and cannabis) with no substance use.
Findings for alcohol largely mirrored those for any substance use. However, the study uncovered additional significant associations, including greater left lateral occipital volume and bilateral para-hippocampal gyri cortical thickness and less bilateral superior frontal gyri cortical thickness.
Nicotine use was associated with lower right superior frontal gyrus volume and deeper left lateral orbitofrontal cortex sulci. And cannabis use was associated with thinner left precentral gyrus and lower right inferior parietal gyrus and right caudate volumes.
The authors noted results for nicotine and cannabis may not have had adequate statistical power, and small effects suggest these findings aren’t clinically informative for individuals. However, they wrote, “They do inform and challenge current theoretical models of addiction.”
Associations Precede Substance Use
A post hoc analysis further challenges current models of addiction. When researchers looked only at the 1203 youth who initiated substance use after the baseline neuroimaging session, they found most associations preceded substance use.
“That regional associations may precede substance use initiation, including less cortical thickness in the right rostral middle frontal gyrus, challenges predominant interpretations that these associations arise largely due to neurotoxic consequences of exposure and increases the plausibility that these features may, at least partially, reflect markers of predispositional risk,” wrote the authors.
A study limitation was that unmeasured confounders and undetected systemic differences in missing data may have influenced associations. Sociodemographic, environmental, and genetic variables that were not included as covariates are likely associated with both neuroanatomical variability and substance use initiation and may moderate associations between them, said the authors.
The ABCD Study provides “a robust and large database of longitudinal data” that goes beyond previous neuroimaging research “to understand the bidirectional relationship between brain structure and substance use,” Miller said in a press release.
“The hope is that these types of studies, in conjunction with other data on environmental exposures and genetic risk, could help change how we think about the development of substance use disorders and inform more accurate models of addiction moving forward,” Miller said.
Reevaluating Causal Assumptions
In an accompanying editorial, Felix Pichardo, MA, and Sylia Wilson, PhD, from the Institute of Child Development, University of Minnesota, Minneapolis, suggested that it may be time to “reevaluate the causal assumptions that underlie brain disease models of addiction” and the mechanisms by which it develops, persists, and becomes harmful.
Neurotoxic effects of substances are central to current brain disease models of addiction, wrote Pichardo and Wilson. “Substance exposure is thought to affect cortical and subcortical regions that support interrelated systems, resulting in desensitization of reward-related processing, increased stress that prompts cravings, negative emotions when cravings are unsated, and weakening of cognitive control abilities that leads to repeated returns to use.”
The editorial writers praised the ABCD Study’s large sample size for providing the precision, statistical accuracy, and ability to identify both larger and smaller effects that are critical for addiction research.
Unlike most addiction research that relies on cross-sectional designs, the current study used longitudinal assessments, which is another of its strengths, they noted.
“Longitudinal study designs like in the ABCD Study are fundamental for establishing temporal ordering across constructs, which is important because establishing temporal precedence is a key step in determining causal links and underlying mechanisms.”
The inclusion of several genetically informative components, such as the family study design, nested twin subsamples, and DNA collection, “allows researchers to extend beyond temporal precedence toward increased causal inference and identification of mechanisms,” they added.
The study received support from the National Institutes of Health. The study authors and editorial writers had no relevant conflicts of interest.
A version of this article appeared on Medscape.com.
FROM JAMA NETWORK OPEN
Broken Sleep Linked to MASLD
TOPLINE:
Fragmented sleep — that is, increased wakefulness and reduced sleep efficiency — is associated with metabolic dysfunction–associated steatotic liver disease (MASLD), a study using actigraphy showed.
METHODOLOGY:
- Researchers assessed sleep-wake rhythms in 35 patients with MASLD (median age, 58 years; 66% were men; 80% with metabolic syndrome) and 16 matched healthy controls (median age, 61 years; 50% were men) using data collected 24/7 via actigraphy for 4 weeks.
- Subanalyses were conducted with MASLD comparator groups: 16 patients with metabolic dysfunction–associated steatohepatitis (MASH), 8 with MASH with cirrhosis, and 11 with non–MASH-related cirrhosis.
- All participants visited the clinic at baseline, week 2, and week 4 to undergo a clinical investigation and complete questionnaires about their sleep.
- A standardized sleep hygiene education session was conducted at week 2.
TAKEAWAY:
- Actigraphy data from patients with MASLD did not reveal significant differences in bedtime, sleep-onset latency, sleep duration, wake-up time, or time in bed compared with controls.
- However, compared with controls, those with MASLD woke 55% more often at night (8.5 vs 5.5), lay awake 113% longer after having first fallen asleep (45.4 minutes vs 21.3 minutes), and slept more often and longer during the day (decreased sleep efficiency).
- Subgroup analyses showed that actigraphy-measured sleep patterns and quality were similarly impaired in patients with MASH, MASH with cirrhosis, and non–MASH-related cirrhosis.
- Patients with MASLD self-reported their fragmented sleep as shorter sleep with a delayed onset. In sleep diaries, 32% of patients with MASLD reported sleep disturbances caused by psychological stress, compared with only 6.25% of controls and 9% of patients with cirrhosis.
- The sleep education session did not change the actigraphy measures or the sleep parameters assessed with sleep questionnaires at the end of the study.
IN PRACTICE:
“We concluded from our data that sleep fragmentation plays a role in the pathogenesis of human MASLD. Whether MASLD causes sleep disorders or vice versa remains unknown. The underlying mechanism presumably involves genetics, environmental factors, and the activation of immune responses — ultimately driven by obesity and metabolic syndrome,” the corresponding author said.
SOURCE:
The study, led by Sofia Schaeffer, PhD, University of Basel, Switzerland, was published online in Frontiers in Network Physiology.
LIMITATIONS:
The study had several limitations. There was a significant difference in body mass index between patients with MASLD (median, 31) and controls (median, 23.5), representing a potential confounder that could explain the differences in sleep behavior. Undetected obstructive sleep apnea could also be a confounding factor. The small number of participants limited the interpretation and generalization of the data, especially in the MASLD subgroups.
DISCLOSURES:
This study was supported by a grant from the University of Basel. One coauthor received a research grant from the University Center for Gastrointestinal and Liver Diseases, Basel, Switzerland. Another coauthor was employed by NovoLytiX. Schaeffer and the remaining coauthors declared that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
A version of this article first appeared on Medscape.com.
Education Boosts Safe Sharps Disposal in Diabetic Care
TOPLINE:
A program combining theoretical training with free disposal containers can effectively increase knowledge and improve sharps waste disposal practices among patients with diabetes.
METHODOLOGY:
- A significant number of patients with diabetes administer insulin at home. Unsafe disposal of sharps waste, including insulin pens, syringes, and lancets, increases the risk for needle-stick injuries, microbial infections, and plastic waste accumulation, highlighting the need for safe disposal practices.
- Researchers conducted a quasi-experimental study at El-Horraya Polyclinic in Alexandria, Egypt, between November 2022 and April 2023 to evaluate the effectiveness of an intervention program in improving knowledge and practices related to safe sharps disposal among patients with diabetes.
- Overall, 100 patients (median age, 61 years; 92% living in urban areas) with either type 1 or type 2 diabetes were recruited and divided into the educational intervention (n = 50) and nonintervention (n = 50) groups; the majority (67%) had diabetes for more than 10 years.
- The intervention group received educational sessions addressing improper disposal risks and environmental impacts along with practical demonstrations of correct sharps disposal methods; they were also given free puncture-resistant containers to safely dispose of the sharps waste generated from diabetes management.
- Assessments were performed at baseline, 2 months, and 4 months postintervention, evaluating knowledge levels (poor: < 50%, fair: 50% to < 70%, good: 70%-100%) and practice scores (poor: 0-6, fair: 7-10, good: 11-14).
TAKEAWAY:
- Overall, 58% of the patients used insulin pens, and approximately 75% required two doses of insulin daily.
- The median monthly disposal was 10 syringes per patient among syringe users and eight pen needles per patient among pen users.
- At baseline, there were no differences in the knowledge scores between the intervention and nonintervention groups; however, at both 2 and 4 months, the intervention group showed a significantly higher median knowledge score than the nonintervention group (P < .001 for both).
- Likewise, practice scores also showed marked improvements in the intervention group, compared with the nonintervention group at the end of the program (P < .001).
IN PRACTICE:
“The success of the environmental education program underscores the need for targeted interventions to enhance patient knowledge and safe sharps disposal practices. By offering accessible disposal options and raising awareness, healthcare facilities can significantly contribute to preventing accidental needle-stick injuries and reducing the risk of infectious disease transmission,” the authors wrote.
SOURCE:
This study was led by Hossam Mohamed Hassan Soliman, High Institute of Public Health, Alexandria University, Egypt. It was published online in Scientific Reports.
LIMITATIONS:
Interview bias and self-reporting bias in data collection were major limitations of this study. The quasi-experimental design, lacking randomization, may have limited the strength of causal inferences.
DISCLOSURES:
No funding was received for this study, and the authors reported no relevant conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Need for Biologic in Early RA Signals Lower Likelihood of Achieving Drug-Free Remission
TOPLINE:
Patients who require biologic disease-modifying antirheumatic drugs (DMARDs) for severe rheumatoid arthritis (RA) are less likely to achieve sustained DMARD-free remission than those not needing the medication.
METHODOLOGY:
- Patients with early RA from the Leiden Early Arthritis Clinic (EAC; n = 627) and the Rotterdam Early Arthritis Cohort (tREACH) trial (n = 425) were followed for 5 years and 3 years, respectively.
- Most patients in both the EAC (86%) and tREACH (64%) cohorts had never used a biologic DMARD during the follow-up period.
- The primary outcome measure was sustained DMARD-free remission, defined as the absence of clinical synovitis after discontinuation of DMARDs for at least 1 year.
TAKEAWAY:
- None of the EAC patients using a biologic DMARD achieved sustained DMARD-free remission, but 37% of those who never used a biologic reached remission at 5 years (hazard ratio [HR], 0.02; P < .0001).
- Similarly, no tREACH patients using a biologic DMARD reached sustained DMARD-free remission, whereas 15% of those who never used a biologic achieved remission at 3 years (HR, 0.03; P < .0001).
- Sustained DMARD-free remission was higher in EAC patients who were negative for anti-citrullinated protein antibody (ACPA) than in those who were ACPA-positive at 5 years (56% vs 14%; P < .0001).
- During follow-up, some patients in both the EAC (9%) and tREACH (14%) cohorts experienced late flares more than 1 year after discontinuing DMARDs.
IN PRACTICE:
“Sustained DMARD-free remission is unlikely in patients needing a biologic DMARD,” the authors said.
SOURCE:
Judith W. Heutz, MD, Erasmus Medical Center, Rotterdam, the Netherlands, led the study, published online on December 20, 2024, in The Lancet Rheumatology.
LIMITATIONS:
Because both cohorts were defined during follow-up rather than at baseline, outcomes related to the use of DMARDs and remission status could have been misinterpreted. Although the study adjusted for ACPA status, other factors such as disease activity were not corrected, which could have potentially led to residual confounding. Sparse data bias was present, especially in the biologic DMARD user group, in which none of the patients reached sustained DMARD-free remission.
DISCLOSURES:
The EAC received funding from the Dutch Arthritis Foundation and the European Research Council under the European Union’s Horizon 2020 research and innovation program. The tREACH trial was supported by an unrestricted grant from Pfizer. The authors declared no competing interests.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
TNF Inhibitors Ward Off Fracture in Axial Spondyloarthritis
TOPLINE:
Tumor necrosis factor (TNF) inhibitors appear to protect patients with axial spondyloarthritis (axSpA) from hip and spine fractures better than other drugs.
METHODOLOGY:
- Researchers analyzed a large US insurance claims database to compare the protective effects of TNF inhibitors vs other drugs against fractures in patients with axSpA.
- The study included 13,519 patients with axSpA aged 18-65 years, of whom 1229 had hip or spine fractures (mean age, 53 years; 38% women) and 12,290 were control participants without fractures.
- Effects of TNF inhibitors, nonsteroidal anti-inflammatory drugs (NSAIDs), conventional synthetic disease-modifying antirheumatic drugs (csDMARDs), or no medication before the fracture were compared.
- The primary outcome was a composite hip and/or spine fracture, and the secondary outcome was a spine fracture.
TAKEAWAY:
- TNF inhibitor users had a lower risk for hip and spine fractures (adjusted odds ratio [aOR], 0.75; 95% CI, 0.62-0.91) than NSAID users, but this protective association was not seen in csDMARD users.
- Sex-stratified analysis showed similar protective effects of TNF inhibitors in both women and men.
- TNF inhibitor users showed a significantly lower risk for spine fractures than NSAID users (aOR, 0.81; 95% CI, 0.66-0.99).
- The protective effect of TNF inhibitors on hip and spine fractures was also seen in patients with a history of prior fractures; however, the effect was not statistically significant.
IN PRACTICE:
“Our findings underscore the multifaceted benefits of TNF inhibitors in axSpA,” the authors wrote.
SOURCE:
Devin Driscoll, MD, Section of Rheumatology, Department of Medicine, Boston University Chobanian & Avedisian School of Medicine, Boston, led the study, published online in Arthritis & Rheumatology.
LIMITATIONS:
Identification of the axSpA cohort and fracture outcomes was based solely on diagnostic and procedure codes, which may have led to misclassification. The administrative database lacked detailed information on disease activity levels, and the high proportion of missing data for body mass index, a known strong confounder for fracture risk, may have introduced bias. There were insufficient numbers of hip fractures to conduct analyses limited solely to hip fractures.
DISCLOSURES:
The study was supported by National Institutes of Health grants R03 AR076495 and P30 AR072571. Two authors declared receiving grants, contracts, payments, and honoraria from, and having other affiliations with, various institutions and pharmaceutical companies.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Stem Cell Transplant Effective for Children With Arthritis
TOPLINE:
Allogeneic hematopoietic stem cell transplantation produced complete responses in all nine surviving children with refractory systemic juvenile idiopathic arthritis–related lung disease, although four of the 13 children who underwent transplantation died of complications.
METHODOLOGY:
- Retrospective cohort study of 13 children with refractory systemic juvenile idiopathic arthritis–related lung disease (sJIA-LD) who had allogeneic hematopoietic stem cell transplantation (HSCT).
- The children (median age at transplantation, 9 years) underwent HSCT at nine hospitals in the United States and Europe between January 2018 and October 2022, with a median follow-up of 16 months.
- Outcomes included transplant-related complications, pulmonary outcomes (eg, oxygen dependence and chest CT findings), and overall outcomes (eg, complete response, partial response, and death).
TAKEAWAY:
- Five patients developed acute graft-versus-host disease of varying grades, but none experienced chronic disease.
- All nine surviving patients achieved a complete response at the last follow-up, with no sJIA characteristics or need for immunosuppressive therapy or supplemental oxygen.
- Four patients died from complications including cytomegalovirus pneumonitis (n = 2), intracranial hemorrhage (n = 1), and progressive sJIA-LD (n = 1).
- Of six patients who underwent posttransplant chest CT, three had improved lung health, two had stable lung disease, and one experienced worsening lung disease, ultimately resulting in death.
IN PRACTICE:
“Allogeneic HSCT should be considered for treatment-refractory sJIA-LD,” the authors wrote.
“Efforts are being pursued for earlier recognition of patients with sJIA-LD at risk of adverse reactions to biologics. Early detection should help to avoid repeated treatments that are less effective and possibly deleterious and consider therapeutic approaches (eg, anti–[interleukin]-18 or [interferon]-delta–targeted treatments) that might act as a bridge therapy to control disease activity before HSCT,” wrote the author of an accompanying editorial.
SOURCE:
Michael G. Matt, MD, and Daniel Drozdov, MD, led the study, which was published online on December 20, 2024, in The Lancet Rheumatology.
LIMITATIONS:
Limitations included sampling bias and heterogeneity in clinical follow-up. The small sample size made it difficult to identify variables affecting survival and the achievement of a complete response. Additionally, many patients had relatively short follow-up periods.
DISCLOSURES:
This study was funded by the National Institute of Arthritis and Musculoskeletal and Skin Diseases, National Institutes of Health. Several authors reported receiving advisory board fees, consulting fees, honoraria, grant funds, and stocks and shares from various research institutes and pharmaceutical organizations.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Colchicine Gout Flare Prophylaxis May Also Protect Against Cardiovascular Events
TOPLINE:
Patients with gout who receive colchicine prophylaxis at the start of urate-lowering therapy have a lower risk for cardiovascular events than those who do not.
METHODOLOGY:
- Retrospective cohort study of 99,800 patients (mean age, 62.8 years; 74.4% men; 85.1% White) newly diagnosed with gout between January 1997 and March 2021 who initiated urate-lowering therapy.
- Gout flare prophylaxis, defined as a colchicine prescription for 21 days or more, was prescribed to 16,028 patients for a mean duration of 47.3 days at a mean daily dose of 0.97 mg.
- Patients who received colchicine prophylaxis and 83,772 patients who did not receive prophylaxis were followed for a mean of 175.5 and 176.9 days, respectively, in the intention-to-treat analysis.
- The primary outcome was the occurrence of the first cardiovascular event (fatal or nonfatal myocardial infarction or stroke) within 180 days of initiation of urate-lowering therapy.
TAKEAWAY:
- The risk for cardiovascular events was significantly lower with colchicine prophylaxis than without it (weighted hazard ratio [HR], 0.82; 95% CI, 0.69-0.94).
- The risk for a first-ever cardiovascular event was significantly lower with colchicine prophylaxis than without it (adjusted HR, 0.80; 95% CI, 0.62-0.97).
- The findings were similar regardless of analytical approach, and the intention-to-treat analysis did not show an increased risk for diarrhea with colchicine.
IN PRACTICE:
“The findings support consideration for the use of colchicine in people with gout and cardiovascular diseases,” the authors wrote.
“The observed beneficial effect of colchicine concerns a huge group of patients worldwide. In addition, it is conceivable that, if a cardiovascular risk reduction is indeed confirmed, a strong argument arises to recommend the prescription of a course of colchicine to all [flaring] patients with gout, independently of their preference for urate-lowering therapy in general or urate-lowering therapy with or without colchicine prophylaxis more specifically,” experts wrote in a linked commentary.
SOURCE:
Edoardo Cipolletta, MD, Academic Rheumatology, School of Medicine, Nottingham City Hospital, University of Nottingham, England, led the study, which was published online in The Lancet Rheumatology.
LIMITATIONS:
Because of the retrospective nature of the data extraction from a prospective database, the study had variations in follow-up and data completeness. Potential surveillance bias could have been introduced because patients with prior cardiovascular events were included in the study, and patients’ adherence to prescribed medications could not be verified.
DISCLOSURES:
This study was funded by the Foundation for Research in Rheumatology. Some authors reported receiving consulting fees, lecturing fees, and travel grants from various pharmaceutical companies and other additional sources.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.