MDedge conference coverage features onsite reporting of the latest study results and expert perspectives from leading researchers.

Cendakimab That Targets IL-13 Shows Promise in Eosinophilic Esophagitis

Article Type
Changed
Tue, 10/29/2024 - 14:25

Cendakimab, a monoclonal antibody targeting interleukin (IL) 13, improved symptoms and reduced esophageal eosinophil counts in adult and adolescent patients with eosinophilic esophagitis (EoE), according to interim results of a pivotal phase 3 trial.

Treatment with cendakimab also improved key endoscopic and histologic features, even in patients who had an inadequate response or intolerance to steroids, reported Alain Schoepfer, MD, gastroenterologist from Centre Hospitalier Universitaire Vaudois and University of Lausanne, in Switzerland.

The drug was generally safe and well tolerated up to 24 weeks of treatment, added Schoepfer, who presented the results at the United European Gastroenterology (UEG) Week 2024.
 

Targeting IL-13 Shows ‘Surprisingly Good Results’

EoE is a chronic, progressive, immune-mediated inflammatory disease driven mainly by the cytokine IL-13.

In a prior phase 2 study, cendakimab, which selectively binds to IL-13 and blocks its interaction with both the IL-13Rα1 and IL-13Rα2 receptors, was shown to improve symptoms and endoscopic features of EoE.

For the current phase 3 trial, participants were required to have a peak eosinophil count (PEC) of ≥ 15 eosinophils (eos)/high power field (hpf) and 4 or more days of dysphagia over the 2 weeks prior to the start of the study. In addition, they had to have shown a complete lack of response to proton pump inhibitor (PPI) treatment for 8 weeks or more.

A total of 430 patients were randomized 1:1:1 to subcutaneous cendakimab (360 mg) once weekly for 48 weeks; subcutaneous cendakimab (360 mg) once weekly for 24 weeks, then once every 2 weeks for a further 24 weeks; or subcutaneous placebo once weekly for 48 weeks.

Patient characteristics were similar across randomization groups. Most participants were men, with a mean age of 35 years (range, 12-75 years); adolescents comprised 6%-11% of the total. Disease duration was around 5-6 years across groups; 45% of participants were on a stable PPI dosage, and around 65% had steroid intolerance or an inadequate response to steroids. The endoscopic reference score was around 10 across all groups. The mean PEC was around 160 eos/hpf in the cendakimab arms vs 200 eos/hpf in the placebo arm.

Schoepfer reported results for the coprimary endpoints — the mean change from baseline in dysphagia days and the proportion of patients with eosinophil histologic response (PEC ≤ 6 eos/hpf) — at week 24. At this point, a total of 286 patients had received treatment with 360 mg of cendakimab once weekly, and 143 had received placebo.

The change in dysphagia days was −6.1 in patients on cendakimab once weekly vs −4.2 in patients on placebo (P = .0005). The proportion of patients with eosinophil histologic response was 28.6% in the treatment arm vs 2.2% in the placebo arm.

The results were similar for patients who were classified as having had a steroid inadequate response. The change in dysphagia days was −6.3 in the cendakimab group vs −4.7 in the placebo group (P = .0156). The eosinophil histologic response was 29.5% in the treatment group vs 2.1% in the placebo group (P < .0001).

Endoscopic response, a key secondary endpoint, was measured as the change from baseline to week 24 in the endoscopic features of EoE. The change in total endoscopic reference score was −5.2 for patients on cendakimab once weekly vs −1.2 for patients on placebo (P < .0001).

The safety profile of cendakimab was “unspectacular,” Schoepfer said, with adverse events related to the study drug occurring in 30% of patients in the treatment arm vs 18.9% of those in the placebo arm. He noted that because the trial was conducted during the COVID-19 pandemic, some infections occurred.

Serious adverse events, which investigators assessed as unrelated to the study drug, occurred in 1.8% of patients on cendakimab and 2.8% of those on placebo. Drug discontinuation occurred in 1.4% of the cendakimab group and 0.7% of the placebo group. There were no deaths.

“We really need drugs for this disease, given that there are very few alternatives to steroids and PPIs,” co-moderator Ram Dickman, MD, Division of Gastroenterology, Rabin Medical Center, Petah Tikva, Israel, said in an interview.

Right now, we have dupilumab, which blocks the signaling of two cytokines, IL-4 and IL-13. But targeting IL-13 by itself “is showing surprisingly good results,” making cendakimab a good candidate for “the first line of biologic treatments,” Dickman said.

“It’s safe and works rapidly,” he added. “Given this is a phase 3 study, I believe we’ll see it on the market.”

Schoepfer has served as a consultant for Adare/Ellodi, AbbVie, AstraZeneca, Celgene/Receptos/Bristol Myers Squibb, Dr. Falk Pharma, Gossamer Bio, GSK, Janssen, MSD, Pfizer, Regeneron/Sanofi, Takeda, and Vifor and has received grant/research support from Adare/Ellodi, Celgene/Receptos/Bristol Myers Squibb, GSK, and Regeneron/Sanofi. Dickman has declared no relevant disclosures.

A version of this article appeared on Medscape.com.


FROM UEG 2024

Innovative Biomaterial May Treat Common Vaginal Changes and Discomfort in Menopausal Women

Article Type
Changed
Tue, 10/29/2024 - 11:54

A novel biomaterial developed by researchers at the University of California, San Diego, may help treat commonly overlooked menopausal vaginal changes and discomfort experienced by many women.

As many as 84% of menopausal women experience genitourinary syndrome of menopause, a condition that can cause vaginal dryness, irritation, and pain during intercourse and significantly affect quality of life. Current treatments, mainly estrogen creams, help with surface issues but don’t address deeper tissue problems.

Marianna Alperin, MD, and researchers at her lab created a gel-like material derived from pig vaginal tissue designed to mimic the natural environment of the vagina and stimulate the body’s own healing processes.

“We used porcine vaginal tissue that was minced, decellularized by detergent, lyophilized, milled into powder, and enzymatically digested,” said Alperin, professor and vice chair for translational research in the Department of Obstetrics, Gynecology, and Reproductive Sciences and professor of urology at the University of California, San Diego.

Use of the vaginal extracellular matrix biomaterial in rats — which have vaginal tissue similar to that of humans — improved vaginal epithelial thickness and the health of the vaginal lining.

Three days after administering the biomaterial, the treatment group exhibited a mean epithelial thickness of 32.37 ± 6.29 µm, compared with 19.00 ± 1.59 µm in the saline control group (P < .0001). Rats treated with vaginal extracellular matrix biomaterial also showed a mean smooth muscle layer thickness of 54.02 ± 10.56 µm, significantly thicker than the saline group’s 35.07 ± 7.80 µm (P < .05), the study found.

“While [the biomaterial] did not restore the epithelial thickness all the way to the level of the healthy, unperturbed animals, it certainly was superior to the other groups, especially at the higher dose,” she said.

It also enhanced the underlying muscle layer, something current treatments don’t typically achieve, the researchers noted.

Alperin’s research was awarded best overall paper at the American Urogynecologic Society’s PFD Week conference in Washington, DC.

The material appears to work by interacting with immune cells that carry it deeper into the vaginal tissues, potentially explaining its widespread effects.

“It looked like the cells are trafficking the biomaterial into the deeper tissues, which is very exciting,” said Alperin, adding that unlike existing treatments, this new approach may improve both the surface layer and deeper tissues of the vagina.

Also, the benefits appeared to increase with higher doses of the material, they found.

While the study shows promise, Alperin acknowledged that further research is needed, particularly comparisons of the treatment with topical estrogen.

“We are repeating the experiment with the dose adjusted to the volume of the rat vagina,” Alperin said.
 

A version of this article appeared on Medscape.com.


When It Comes to Polyp Diagnosis With CADx, Location Matters

Article Type
Changed
Thu, 11/07/2024 - 02:03

The effectiveness of computer-aided diagnosis (CADx) in differentiating neoplastic from non-neoplastic polyps depends on the region of the colon examined, according to a systematic review and meta-analysis.

In particular, the diagnostic performance of CADx for polyps showed significantly lower specificity in the proximal colon than in the distal colon.

“While current CADx systems are suitable for use in the distal colon, they should not be employed for diagnosing polyps in the proximal colon until new, higher performing systems are developed specifically for these lesions,” said study lead Tommy Rizkala, MD, Endoscopy Unit, IRCCS Humanitas Clinical and Research Center, Rozzano, Italy.

The “main strength” of the review is that the researchers contacted each study author for more specific information and were therefore able to divide the data into the proximal colon and the rectosigmoid colon, he explained.

“This is the first paper that has really collected these data. Most papers provide data for the entire colon or just for the rectosigmoid colon,” said Rizkala, who presented the findings at the United European Gastroenterology (UEG) Week 2024.

The study was also recently published in Clinical Gastroenterology and Hepatology.

Optical diagnosis enables real-time histologic predictions of polyps 5 mm or smaller during colonoscopy, offering potential clinical and cost-saving benefits. Two optical diagnostic strategies are used for polyps in this size range, based on location: a leave-in-situ strategy (applied only in the rectosigmoid colon when there is high confidence that a polyp is non-neoplastic) and a resect-and-discard strategy (applied throughout the colon when there is high confidence of neoplastic histology on optical diagnosis).

Rizkala carried out a review of studies that evaluated the performance of real-time CADx alone — independent of endoscopist judgment — for predicting the histology of colorectal polyps 5 mm or smaller. The primary endpoints were CADx sensitivity and specificity in the proximal colon (the portion extending from the descending colon to the cecum) and the distal colon (limited to the rectosigmoid region). Secondary outcomes were the negative predictive value (NPV), positive predictive value (PPV), and accuracy of the CADx alone in the proximal colon and the distal colon.
 

Lower Specificity in the Proximal Colon

An analysis of data on 7782 polyps ≤ 5 mm from 11 studies found specificity values of 0.62 (95% CI, 0.52-0.71) in the proximal colon and 0.85 (95% CI, 0.75-0.92) in the distal colon, with a risk ratio (RR) of 0.74 (95% CI, 0.72-0.84), meaning that the specificity of CADx was significantly lower in the proximal colon than in the distal colon.

“According to the optical diagnosis strategy, we can use the leave-in-situ approach for the distal colon because the performance is adequate, but for the rest of the colon, CADx requires further enhancement,” Rizkala said.

Sensitivity values were 0.89 (95% CI, 0.83-0.93) and 0.87 (95% CI, 0.80-0.92) for the proximal and distal regions, respectively, with an RR of 1.00 (95% CI, 0.97-1.03).

Regarding the secondary outcomes, the NPV was 0.64 vs 0.93 for the proximal vs distal colon, with an RR of 0.71 (95% CI, 0.64-0.79), and accuracy was 0.81 vs 0.86, with an RR of 0.95 (95% CI, 0.91-0.99).

With the higher prevalence of neoplastic lesions in the proximal colon than in the distal colon, a lower NPV was observed in the proximal colon, Rizkala noted.

The PPV was 0.87 vs 0.76 for the proximal vs distal colon, with an RR of 1.11 (95% CI, 1.06-1.17), so the two parts of the colon were comparable, he reported.
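The prevalence effect behind these NPV and PPV differences can be sketched with simple Bayes' rule arithmetic. In the snippet below, the pooled sensitivity and specificity values come from the meta-analysis, but the prevalence figures for neoplastic histology are illustrative assumptions, not study data.

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Bayes' rule: predictive values from test characteristics and prevalence."""
    tp = sensitivity * prevalence              # true positives (per unit population)
    fn = (1 - sensitivity) * prevalence        # false negatives
    tn = specificity * (1 - prevalence)        # true negatives
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    return tn / (tn + fn), tp / (tp + fp)      # (NPV, PPV)

# Pooled CADx sensitivity/specificity from the meta-analysis; the assumed
# prevalences of neoplastic histology are hypothetical, for illustration only.
npv_prox, ppv_prox = predictive_values(0.89, 0.62, prevalence=0.75)  # proximal
npv_dist, ppv_dist = predictive_values(0.87, 0.85, prevalence=0.35)  # distal
```

With these assumed prevalences, the arithmetic lands near the reported values (proximal NPV ≈ 0.65 vs distal ≈ 0.92; proximal PPV ≈ 0.88 vs distal ≈ 0.76), illustrating how a test with similar sensitivity in both regions can still yield a much lower NPV where neoplastic lesions are more common.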

In the future, CADx systems should be trained on more lesions from the proximal colon; current systems learn from the available endoscopic data, in which most polyps come from the rectosigmoid colon, Rizkala said.

We would also “like manufacturers of CADx systems to provide public access to data balanced between the proximal and distal regions of the colon,” he added.

Diagnosis More Challenging Than Detection With CADx

Commenting on the study, comoderator David G. Graham, MD, consultant gastroenterologist at University College London Hospital in England, remarked: “The key questions here relate to why are these systems underperforming in the proximal colon, and how can we improve this?”

Are these results “due to the very different appearance of adenomas in the distal colon vs the proximal colon on CADx (which is not what we see as endoscopists but seems to be what the systems are seeing), or is it due to a different characterization of polyps,” that is, more sessile serrated lesions in the proximal colon than in the distal colon, he asked.

Also commenting on the study was Raf Bisschops, MD, head of endoscopy at KU Leuven in Belgium. He remarked that the review underscores the fact that optical diagnosis by artificial intelligence is a more challenging task than detection.

It is “not entirely clear” what would explain the difference in performance of CADx between the distal colon and proximal colon, he said. It can’t be excluded that the inclusion of different CADx systems, some of which clearly underperformed, may account for the difference.

He went on to suggest that the differences might reflect finer distinctions of location than simply proximal vs distal.

“The difference in performance between the right and left colon is also interesting, since recent insights into the molecular and morphological features of hyperplastic polyps indicate that there are different classes, with more goblet cell–rich hyperplastic polyps in the right colon and more microvesicular hyperplastic polyps in the left.”

These have “distinct microscopic and endoscopic appearances” that could account for a difference in performance of a CADx system if not included in the training and validation sets, he explained.

Rizkala and Graham reported no relevant disclosures. Bisschops reported receiving research grants and speaker fees from Medtronic, Fujifilm, and Pentax.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

The effectiveness of computer-aided diagnosis (CADx) in differentiating neoplastic from non-neoplastic polyps depends on the region of the colon examined, according to a systematic review and meta-analysis.

In particular, the diagnostic performance of CADx for polyps showed significantly lower specificity in the proximal colon than in the distal colon.

“While current CADx systems are suitable for use in the distal colon, they should not be employed for diagnosing polyps in the proximal colon until new, higher performing systems are developed specifically for these lesions,” said study lead Tommy Rizkala, MD, Endoscopy Unit, IRCCS Humanitas Clinical and Research Center, Rozzano, Italy.

The “main strength” of the review is that the researchers contacted each study author for more specific information and were therefore able to divide the data into the proximal colon and the rectosigmoid colon, he explained.

“This is the first paper that has really collected these data. Most papers provide data for the entire colon or just for the rectosigmoid colon,” said Rizkala, who presented the findings at the United European Gastroenterology (UEG) Week 2024.

The study was also recently published in Clinical Gastroenterology and Hepatology.

Optical diagnosis enables real-time histologic prediction for polyps 5 mm or smaller during colonoscopy, offering potential clinical and cost-saving benefits. Two location-based optical diagnosis strategies are used for polyps in this size range: a leave-in-situ strategy (applied only in the rectosigmoid colon, when there is high confidence that a polyp is non-neoplastic) and a resect-and-discard strategy (applied throughout the colon, when there is high confidence that a polyp is neoplastic).

Rizkala carried out a review of studies that evaluated the performance of real-time CADx alone — independent of endoscopist judgment — for predicting the histology of colorectal polyps 5 mm or smaller. The primary endpoints were CADx sensitivity and specificity in the proximal colon (the portion extending from the descending colon to the cecum) and the distal colon (limited to the rectosigmoid region). Secondary outcomes were the negative predictive value (NPV), positive predictive value (PPV), and accuracy of the CADx alone in the proximal colon and the distal colon.
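The four endpoints are the standard quantities derived from a 2 × 2 confusion matrix of CADx calls against histology. As a minimal illustration (the counts below are hypothetical, not the study's data), they can be computed as:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Optical-diagnosis performance metrics from a 2x2 confusion matrix.

    tp: neoplastic polyps correctly called neoplastic by CADx
    fp: non-neoplastic polyps incorrectly called neoplastic
    tn: non-neoplastic polyps correctly called non-neoplastic
    fn: neoplastic polyps incorrectly called non-neoplastic
    """
    return {
        "sensitivity": tp / (tp + fn),            # true-positive rate
        "specificity": tn / (tn + fp),            # true-negative rate
        "ppv": tp / (tp + fp),                    # positive predictive value
        "npv": tn / (tn + fn),                    # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical counts for illustration only:
metrics = diagnostic_metrics(tp=890, fp=380, tn=620, fn=110)
```

Sensitivity and specificity are properties of the system itself, while PPV and NPV also depend on how common neoplastic polyps are in the segment examined, which matters for the results below.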
 

Lower Specificity in the Proximal Colon

An analysis of data based on 7782 polyps ≤ 5 mm from 11 studies found specificity values of 0.62 (95% CI, 0.52-0.71) and 0.85 (95% CI, 0.75-0.92) for the proximal and distal regions of the colon, respectively, with a risk ratio (RR) of 0.74 (95% CI, 0.72-0.84), meaning that CADx specificity was significantly lower in the proximal colon than in the distal colon.

“According to the optical diagnosis strategy, we can use the leave-in-situ approach for the distal colon because the performance is adequate, but for the rest of the colon, CADx requires further enhancement,” Rizkala said.

Sensitivity values were 0.89 (95% CI, 0.83-0.93) and 0.87 (95% CI, 0.80-0.92) for the proximal and distal regions, respectively, with an RR of 1.00 (95% CI, 0.97-1.03).

Regarding the secondary outcomes, the NPV was 0.64 vs 0.93 for the proximal vs distal colon, with an RR of 0.71 (95% CI, 0.64-0.79), and accuracy was 0.81 vs 0.86, with an RR of 0.95 (95% CI, 0.91-0.99).

Because neoplastic lesions are more prevalent in the proximal colon than in the distal colon, a lower NPV was observed in the proximal colon, Rizkala noted.

The PPV was 0.87 vs 0.76 for the proximal vs distal colon, with an RR of 1.11 (95% CI, 1.06-1.17), so the two parts of the colon were comparable, he reported.
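The prevalence effect Rizkala describes follows directly from Bayes' rule: with sensitivity and specificity held fixed, NPV falls and PPV rises as the proportion of neoplastic polyps increases. A back-of-the-envelope sketch, using the reported proximal point estimates (sensitivity 0.89, specificity 0.62) and hypothetical prevalence values:

```python
def predictive_values(sensitivity, specificity, prevalence):
    # Bayes' rule: predictive values depend on disease prevalence,
    # not only on the test's intrinsic sensitivity and specificity.
    ppv = (sensitivity * prevalence) / (
        sensitivity * prevalence + (1 - specificity) * (1 - prevalence))
    npv = (specificity * (1 - prevalence)) / (
        specificity * (1 - prevalence) + (1 - sensitivity) * prevalence)
    return ppv, npv

# Reported proximal-colon point estimates; prevalence values are hypothetical.
se_prox, sp_prox = 0.89, 0.62
for prev in (0.5, 0.7, 0.9):
    ppv, npv = predictive_values(se_prox, sp_prox, prev)
    print(f"prevalence {prev:.0%}: PPV {ppv:.2f}, NPV {npv:.2f}")
```

As prevalence rises across the hypothetical values, PPV climbs while NPV drops, mirroring the pattern reported between the distal and proximal colon.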

In the future, CADx systems should be trained on more lesions from the proximal colon, Rizkala said; current systems are trained on the available endoscopic datasets, in which most polyps come from the rectosigmoid colon.

He added that he would also “like manufacturers of CADx systems to provide public access to data balanced between the proximal and distal regions of the colon.”

Diagnosis More Challenging Than Detection With CADx

Commenting on the study, comoderator David G. Graham, MD, consultant gastroenterologist at University College London Hospital in England, remarked: “The key questions here relate to why are these systems underperforming in the proximal colon, and how can we improve this?”

Are these results “due to the very different appearance of adenomas in the distal colon vs the proximal colon on CADx (which is not what we see as endoscopists but seems to be what the systems are seeing), or is it due to a different characterization of polyps,” that is, more sessile serrated lesions in the proximal colon than in the distal colon, he asked.

Also commenting on the study was Raf Bisschops, MD, head of endoscopy at KU Leuven in Belgium. He remarked that the review underscores the fact that optical diagnosis by artificial intelligence is a more challenging task than detection.

It is “not entirely clear” what would explain the difference in performance of CADx between the distal colon and proximal colon, he said. It can’t be excluded that the inclusion of different CADx systems, some of which clearly underperformed, may account for the difference.

He went on to suggest that the differences might reflect location at a finer granularity than simply proximal vs distal.

“The difference in performance between the right and left colon is also interesting, since recent insights into the molecular and morphological features of hyperplastic polyps indicate that there are different classes, with more goblet cell–rich hyperplastic polyps in the right colon and more microvesicular hyperplastic polyps in the left.”

These have “distinct microscopic and endoscopic appearances” that could account for a difference in performance of a CADx system if not included in the training and validation sets, he explained.

Rizkala and Graham reported no relevant disclosures. Bisschops reported receiving research grants and speaker fees from Medtronic, Fujifilm, and Pentax.

A version of this article first appeared on Medscape.com.



FROM UEG 2024


Just Call It ‘Chronic Rhinitis’ and Reach for These Treatments

Article Type
Changed
Tue, 10/29/2024 - 10:05

 

This transcript has been edited for clarity.

Matthew F. Watto, MD: I’m here with my great friend and America’s primary care physician, Dr. Paul Nelson Williams. Paul, are you ready to talk about rhinitis?

Paul N. Williams, MD: I’m excited. It’s always the season to talk about rhinitis.

Watto: We had a great guest for this podcast, Rhinitis and Environmental Allergies with Dr. Olajumoke Fadugba from Penn Medicine. She’s an allergist and immunologist. One of her pet peeves is when people just call everything “allergic rhinitis” because we should be calling it “chronic rhinitis,” if it’s chronic. That’s an umbrella term, and there are many buckets underneath it that people could fall into.

When you’re taking a history, you have to figure out whether it’s perennial (meaning it happens year round) because certain things can cause that. Cat dander is around all the time, so people with cats might have sinus symptoms all year. Dust mites are another one, and it’s pretty hard to avoid those. Those are some perennial allergens. 

Then there is allergic vs nonallergic rhinitis, which is something I hadn’t really put too much thought into.

Williams: I didn’t realize exactly how nuanced it got. Nonallergic rhinitis can still be seasonal because changes in temperature and humidity can trigger the rhinitis. And it matters what medications you use for what.

Watto: Here are some ways you can try to figure out if rhinitis is allergic or nonallergic. Ask the patient if they have itchy eyes and are sneezing a lot. That can be more of an allergic rhinitis, but both allergic and nonallergic rhinitis have the congestion, the rhinorrhea, so you can’t figure it out based on that alone.

Dr. Fadugba said that one clue that it might be nonallergic rhinitis is the age of onset. If the symptoms are later in onset (older age), then 30%-40% of rhinitis is nonallergic. If the patient has never had allergies and now all of a sudden they have new chronic sinus symptoms, it’s probably nonallergic rhinitis. It’s a diagnosis of exclusion.

I guess they need allergy testing?

Williams: If you want to make a definitive diagnosis, you need to rule it out. I suspect that you might be able to get away with some empirical treatment. If they get better, you can feel like a winner because getting booked in for allergy testing can be a little bit of a challenge.

Watto: The main treatment difference is that the oral antihistamines do not really seem to work for nonallergic rhinitis, but they can help with allergic rhinitis. Weirdly, the nasal antihistamines and nasal steroids do seem to work for both allergic and nonallergic rhinitis.

I don’t understand the mechanism there, but if you think someone might have nonallergic rhinitis, I wouldn’t go with the oral antihistamines as your first-line treatment. I would go with a nasal spray; you pretty much can’t go wrong with either an antihistamine or a steroid nasal spray.

Williams: We typically start with the nasal sprays. That’s kind of first-line for almost everybody, allergic or nonallergic. You’re probably going to start with an intranasal steroid, and then it’s kind of dealer’s choice what the patient can tolerate and afford. Sometimes you can get them covered by insurance, at least in my experience. 

I will say that this is one of the medications — like nicotine patches and other things — where we as doctors don’t really counsel patients on how to use it appropriately. So with our expert, we revisited the idea of the patient pointing the nasal spray laterally, toward their ear basically, and not spraying toward their brain. There should not be a slurping sound afterward, because “if you taste it, you waste it,” as the allergists and immunologists say. It’s supposed to sit up there and not be swallowed immediately. 

If your patient is sensitive to the floral flavor of some of the fluticasones (which I don’t mind so much as a user myself), then you can try mometasone or the other formulations. They are all roughly equivalent. 

Watto: Speaking of medications, which medications can cause rhinitis? Any meds we commonly use in primary care?

Williams: Apparently the combined hormonal oral contraceptives can do it. Also the phosphodiesterase 5 (PDE-5) inhibitors. Drugs that cause vasodilation can also do it. Some of the antihypertensives. I’ve seen beta-blockers and angiotensin-converting enzyme (ACE) inhibitors listed specifically, and some of the medications for benign prostatic hyperplasia (BPH). So there are a couple of medications that you can think about as a potential cause of rhinitis, although my suspicion is not going to be as high as for some of the other causes.

Watto: We mentioned medication treatments for patients who are really bothered by rhinorrhea, and maybe they are already on a steroid or an antihistamine.

You can try nasal ipratropium for people that have really prominent rhinorrhea. Dr. Fadugba said that can work well, and it’s usually taken three or four times a day. I’ve had good success prescribing it for my patients. Another one that I have never prescribed, but that Dr. Fadugba said is available over the counter, is intranasal cromolyn — a mast cell stabilizer. She said it can be beneficial.

Let’s say I had a cat allergy and I was going to visit Paul. I could use the intranasal cromolyn ahead of time to reduce rhinitis when I’m around the cats.

Paul, what about montelukast? I never know what to do with that one.

Williams: I’ve seen it prescribed as a last-ditch attempt to fix chronic rhinitis. Dr. Fadugba said she only ever prescribes it for patients who have rhinitis symptoms and asthma and never just for chronic rhinitis because it doesn’t work. And also, there have been some new black-box warnings from the US Food and Drug Administration (FDA). So unless there’s a solid indication for it, montelukast is not something you should just prescribe to try to see if it will work. That’s probably not the right approach for this.

But if the patient has difficult-to-control asthma and, as a component, challenging nasal symptoms as well, it might be a reasonable medication to try. 

Watto: And finally, Paul, how does climate change possibly have anything to do with rhinitis?

Williams: I feel like I'm just seeing more and more of this every year. I don't know if that's because I'm more attuned to it or because I'm having more symptoms myself, but it turns out the prevalence actually is going up.

We’re seeing more of it in part because it’s getting hotter outside, which is in turn worsening the production of allergens and increasing the allergen exposure and the severity of the symptoms that go along with it. More people are having more severe disease because the world is changing as a result of the stuff that we do. So fix that. But also be mindful and expect to see even more of these problems as you move forward in your careers. 

Watto: Dr. Fadugba gave us so many great tips. You can listen to the full podcast episode here.

Dr. Watto, Clinical Assistant Professor, Department of Medicine, Perelman School of Medicine at University of Pennsylvania; Internist, Department of Medicine, Hospital Medicine Section, Pennsylvania Hospital, Philadelphia, has disclosed no relevant financial relationships. Dr. Williams, Associate Professor of Clinical Medicine, Department of General Internal Medicine, Lewis Katz School of Medicine; Staff Physician, Department of General Internal Medicine, Temple Internal Medicine Associates, Philadelphia, disclosed ties with The Curbsiders.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

 

This transcript has been edited for clarity.

Matthew F. Watto, MD: I’m here with my great friend and America’s primary care physician, Dr. Paul Nelson Williams. Paul, are you ready to talk about rhinitis?

Paul N. Williams, MD: I’m excited. It’s always the season to talk about rhinitis.

Watto: We had a great guest for this podcast, Rhinitis and Environmental Allergies with Dr. Olajumoke Fadugba from Penn Medicine. She’s an allergist and immunologist. One of her pet peeves is when people just call everything “allergic rhinitis” because we should be calling it “chronic rhinitis,” if it’s chronic. That’s an umbrella term, and there are many buckets underneath it that people could fall into.

When you’re taking a history, you have to figure out whether it’s perennial (meaning it happens year round) because certain things can cause that. Cat dander is around all the time, so people with cats might have sinus symptoms all year. Dust mites are another one, and it’s pretty hard to avoid those. Those are some perennial allergens. 

Then there is allergic vs nonallergic rhinitis, which is something I hadn’t really put too much thought into.

Williams: I didn’t realize exactly how nuanced it got. Nonallergic rhinitis can still be seasonal because changes in temperature and humidity can trigger the rhinitis. And it matters what medications you use for what.

Watto: Here are some ways you can try to figure out if rhinitis is allergic or nonallergic. Ask the patient if they have itchy eyes and are sneezing a lot. That can be more of an allergic rhinitis, but both allergic and nonallergic rhinitis have the congestion, the rhinorrhea, so you can’t figure it out based on that alone.

Dr. Fadugba said that one clue that it might be nonallergic rhinitis is the age of onset. If the symptoms are later in onset (older age), then 30%-40% of rhinitis is nonallergic. If the patient has never had allergies and now all of a sudden they have new chronic sinus symptoms, it’s probably nonallergic rhinitis. It’s a diagnosis of exclusion.

I guess they need allergy testing?

Williams: If you want to make a definitive diagnosis, you need to rule it out. I suspect that you might be able to get away with some empirical treatment. If they get better, you can feel like a winner because getting booked in for allergy testing can be a little bit of a challenge.

Watto: The main treatment difference is that the oral antihistamines do not really seem to work for nonallergic rhinitis, but they can help with allergic rhinitis. Weirdly, the nasal antihistamines and nasal steroids do seem to work for both allergic and nonallergic rhinitis.

I don’t understand the mechanism there, but if you think someone might have nonallergic rhinitis, I wouldn’t go with the oral antihistamines as your first-line treatment. I would go with a nasal spray; you pretty much can’t go wrong with either an antihistamine or a steroid nasal spray.

Williams: We typically start with the nasal sprays. That’s kind of first-line for almost everybody, allergic or nonallergic. You’re probably going to start with an intranasal steroid, and then it’s kind of dealer’s choice what the patient can tolerate and afford. Sometimes you can get them covered by insurance, at least in my experience. 

I will say that this is one of the medications — like nicotine patches and other things — where we as doctors don’t really counsel patients on how to use it appropriately. So with our expert, we revisited the idea of the patient pointing the nasal spray laterally, toward their ear basically, and not spraying toward their brain. There should not be a slurping sound afterward, because “if you taste it, you waste it,” as the allergists and immunologists say. It’s supposed to sit up there and not be swallowed immediately. 

If your patient is sensitive to the floral flavor of some of the fluticasones (which I don’t mind so much as a user myself), then you can try mometasone or the other formulations. They are all roughly equivalent. 

Speaking of medications, which medications can cause rhinitis? Any meds we commonly use in primary care?

Williams: Apparently the combined hormonal oral contraceptives can do it. Also the phosphodiesterase 5 (PDE-5) inhibitors. Drugs that cause vasodilation can also do it. Some of the antihypertensives. I’ve seen beta-blockers and angiotensin-converting enzyme (ACE) inhibitors listed specifically, and some of the medications for benign prostatic hyperplasia (BPH). So there are a couple of medications that you can think about as a potential cause of rhinitis, although my suspicion is not going to be as high as for some of the other causes.

Watto: We mentioned medication treatments for patients who are really bothered by rhinorrhea, and maybe they are already on a steroid or an antihistamine.

You can try nasal ipratropium for people that have really prominent rhinorrhea. Dr. Fadugba said that can work well, and it’s usually taken three or four times a day. I’ve had good success prescribing it for my patients. Another one that I have never prescribed, but that Dr. Fadugba said is available over the counter, is intranasal cromolyn — a mast cell stabilizer. She said it can be beneficial.

Let’s say I had a cat allergy and I was going to visit Paul. I could use the intranasal cromolyn ahead of time to reduce rhinitis when I’m around the cats.

Paul, what about montelukast? I never know what to do with that one.

Williams: I’ve seen it prescribed as a last-ditch attempt to fix chronic rhinitis. Dr. Fadugba said she only ever prescribes it for patients who have rhinitis symptoms and asthma and never just for chronic rhinitis because it doesn’t work. And also, there have been some new black-box warnings from the US Food and Drug Administration (FDA). So unless there’s a solid indication for it, montelukast is not something you should just prescribe to try to see if it will work. That’s probably not the right approach for this.

But if the patient has challenging control asthma, and as a component, challenging nasal symptoms as well, it might be a reasonable medication to try. 

Watto: And finally, Paul, how does climate change possibly have anything to do with rhinitis?

Williams: I feel like I’m just seeing more and more of the stuff every year. I don’t know if I’m more sensitive to it or because I’m having more symptoms myself, but it turns out the prevalence actually is going up.

We’re seeing more of it in part because it’s getting hotter outside, which is in turn worsening the production of allergens and increasing the allergen exposure and the severity of the symptoms that go along with it. More people are having more severe disease because the world is changing as a result of the stuff that we do. So fix that. But also be mindful and expect to see even more of these problems as you move forward in your careers. 

Watto: Dr. Fadugba gave us so many great tips. You can listen to the full podcast episode here.

Dr. Watto, Clinical Assistant Professor, Department of Medicine, Perelman School of Medicine at University of Pennsylvania; Internist, Department of Medicine, Hospital Medicine Section, Pennsylvania Hospital, Philadelphia, has disclosed no relevant financial relationships. Dr. Williams, Associate Professor of Clinical Medicine, Department of General Internal Medicine, Lewis Katz School of Medicine; Staff Physician, Department of General Internal Medicine, Temple Internal Medicine Associates, Philadelphia, disclosed ties with The Curbsiders.

A version of this article first appeared on Medscape.com.

 

This transcript has been edited for clarity.

Matthew F. Watto, MD: I’m here with my great friend and America’s primary care physician, Dr. Paul Nelson Williams. Paul, are you ready to talk about rhinitis?

Paul N. Williams, MD: I’m excited. It’s always the season to talk about rhinitis.

Watto: We had a great guest for this podcast, Rhinitis and Environmental Allergies with Dr. Olajumoke Fadugba from Penn Medicine. She’s an allergist and immunologist. One of her pet peeves is when people just call everything “allergic rhinitis” because we should be calling it “chronic rhinitis,” if it’s chronic. That’s an umbrella term, and there are many buckets underneath it that people could fall into.

When you’re taking a history, you have to figure out whether it’s perennial (meaning it happens year round) because certain things can cause that. Cat dander is around all the time, so people with cats might have sinus symptoms all year. Dust mites are another one, and it’s pretty hard to avoid those. Those are some perennial allergens. 

Then there is allergic vs nonallergic rhinitis, which is something I hadn’t really put too much thought into.

Williams: I didn’t realize exactly how nuanced it got. Nonallergic rhinitis can still be seasonal because changes in temperature and humidity can trigger the rhinitis. And it matters what medications you use for what.

Watto: Here are some ways you can try to figure out if rhinitis is allergic or nonallergic. Ask the patient if they have itchy eyes and are sneezing a lot. That can be more of an allergic rhinitis, but both allergic and nonallergic rhinitis have the congestion, the rhinorrhea, so you can’t figure it out based on that alone.

Dr. Fadugba said that one clue that it might be nonallergic rhinitis is the age of onset. If the symptoms are later in onset (older age), then 30%-40% of rhinitis is nonallergic. If the patient has never had allergies and now all of a sudden they have new chronic sinus symptoms, it’s probably nonallergic rhinitis. It’s a diagnosis of exclusion.

I guess they need allergy testing?

Williams: If you want to make a definitive diagnosis, you need to rule it out. I suspect that you might be able to get away with some empirical treatment. If they get better, you can feel like a winner because getting booked in for allergy testing can be a little bit of a challenge.

Watto: The main treatment difference is that the oral antihistamines do not really seem to work for nonallergic rhinitis, but they can help with allergic rhinitis. Weirdly, the nasal antihistamines and nasal steroids do seem to work for both allergic and nonallergic rhinitis.

I don’t understand the mechanism there, but if you think someone might have nonallergic rhinitis, I wouldn’t go with the oral antihistamines as your first-line treatment. I would go with a nasal spray; you pretty much can’t go wrong with either an antihistamine or a steroid nasal spray.

Williams: We typically start with the nasal sprays. That’s kind of first-line for almost everybody, allergic or nonallergic. You’re probably going to start with an intranasal steroid, and then it’s kind of dealer’s choice what the patient can tolerate and afford. Sometimes you can get them covered by insurance, at least in my experience. 

I will say that this is one of the medications — like nicotine patches and other things — where we as doctors don’t really counsel patients on how to use it appropriately. So with our expert, we revisited the idea of the patient pointing the nasal spray laterally, toward their ear basically, and not spraying toward their brain. There should not be a slurping sound afterward, because “if you taste it, you waste it,” as the allergists and immunologists say. It’s supposed to sit up there and not be swallowed immediately. 

If your patient is sensitive to the floral flavor of some of the fluticasones (which I don’t mind so much as a user myself), then you can try mometasone or the other formulations. They are all roughly equivalent. 

Speaking of medications, which medications can cause rhinitis? Any meds we commonly use in primary care?

Williams: Apparently the combined hormonal oral contraceptives can do it. Also the phosphodiesterase 5 (PDE-5) inhibitors. Drugs that cause vasodilation can also do it. Some of the antihypertensives. I’ve seen beta-blockers and angiotensin-converting enzyme (ACE) inhibitors listed specifically, and some of the medications for benign prostatic hyperplasia (BPH). So there are a couple of medications that you can think about as a potential cause of rhinitis, although my suspicion is not going to be as high as for some of the other causes.

Watto: We mentioned medication treatments for patients who are really bothered by rhinorrhea, and maybe they are already on a steroid or an antihistamine.

You can try nasal ipratropium for people that have really prominent rhinorrhea. Dr. Fadugba said that can work well, and it’s usually taken three or four times a day. I’ve had good success prescribing it for my patients. Another one that I have never prescribed, but that Dr. Fadugba said is available over the counter, is intranasal cromolyn — a mast cell stabilizer. She said it can be beneficial.

Let’s say I had a cat allergy and I was going to visit Paul. I could use the intranasal cromolyn ahead of time to reduce rhinitis when I’m around the cats.

Paul, what about montelukast? I never know what to do with that one.

Williams: I’ve seen it prescribed as a last-ditch attempt to fix chronic rhinitis. Dr. Fadugba said she only ever prescribes it for patients who have rhinitis symptoms and asthma and never just for chronic rhinitis because it doesn’t work. And also, there have been some new black-box warnings from the US Food and Drug Administration (FDA). So unless there’s a solid indication for it, montelukast is not something you should just prescribe to try to see if it will work. That’s probably not the right approach for this.

But if the patient has difficult-to-control asthma with challenging nasal symptoms as a component, it might be a reasonable medication to try. 

Watto: And finally, Paul, how does climate change possibly have anything to do with rhinitis?

Williams: I feel like I’m just seeing more and more of this every year. I don’t know if it’s because I’m more attuned to it or because I’m having more symptoms myself, but it turns out the prevalence actually is going up.

We’re seeing more of it in part because it’s getting hotter outside, which is in turn worsening the production of allergens and increasing the allergen exposure and the severity of the symptoms that go along with it. More people are having more severe disease because the world is changing as a result of the stuff that we do. So fix that. But also be mindful and expect to see even more of these problems as you move forward in your careers. 

Watto: Dr. Fadugba gave us so many great tips. You can listen to the full podcast episode here.

Dr. Watto, Clinical Assistant Professor, Department of Medicine, Perelman School of Medicine at University of Pennsylvania; Internist, Department of Medicine, Hospital Medicine Section, Pennsylvania Hospital, Philadelphia, has disclosed no relevant financial relationships. Dr. Williams, Associate Professor of Clinical Medicine, Department of General Internal Medicine, Lewis Katz School of Medicine; Staff Physician, Department of General Internal Medicine, Temple Internal Medicine Associates, Philadelphia, disclosed ties with The Curbsiders.

A version of this article first appeared on Medscape.com.


New Data on DOAC Initiation After Stroke in AF: Final Word?

Article Type
Changed
Mon, 10/28/2024 - 15:35

— The long-standing debate as to when to start anticoagulation in patients with an acute ischemic stroke and atrial fibrillation (AF) looks as though it’s settled.

Results of the OPTIMAS trial, the largest trial to address this question, showed that initiation of a direct oral anticoagulant (DOAC) within 4 days after ischemic stroke associated with AF was noninferior to delayed initiation (7-14 days) for the composite outcome of ischemic stroke, intracranial hemorrhage, unclassifiable stroke, or systemic embolism at 90 days. Importantly, early DOAC initiation was safe with a low rate of symptomatic hemorrhage, regardless of stroke severity.

In addition, a new meta-analysis, known as CATALYST, which included all four randomized trials now available on this issue, showed a clear benefit of earlier initiation (within 4 days) versus later (5 days and up) on its primary endpoint of new ischemic stroke, symptomatic intracerebral hemorrhage, and unclassified stroke at 30 days.

The results of the OPTIMAS trial and the meta-analysis were both presented at the 16th World Stroke Congress (WSC) 2024. The OPTIMAS trial was also simultaneously published online in The Lancet.

“Our findings do not support the guideline recommended practice of delaying DOAC initiation after ischemic stroke with AF regardless of clinical stroke severity, reperfusion or prior anticoagulation,” said OPTIMAS investigator David Werring, PhD, University College London in England.

Presenting the meta-analysis, Signild Åsberg, MD, Uppsala University, Uppsala, Sweden, said her group’s findings “support the early start of DOACs (within 4 days) in clinical practice.”

Werring pointed out that starting anticoagulation early also had important logistical advantages.

“This means we can start anticoagulation before patients are discharged from hospital, thus ensuring that this important secondary prevention medication is always prescribed, when appropriate. That’s going to be a key benefit in the real world.”
 

Clinical Dilemma

Werring noted that AF accounts for 20%-30% of ischemic strokes, which tend to be more severe than other stroke types. The pivotal trials of DOACs did not include patients within 30 days of an acute ischemic stroke, creating a clinical dilemma on when to start this treatment.

“On the one hand, we wish to start anticoagulation early to reduce early recurrence of ischemic stroke. But on the other hand, there are concerns that if we start anticoagulation early, it could cause intracranial bleeding, including hemorrhagic transformation of the acute infarct. Guidelines on this issue are inconsistent and have called for randomized control trials in this area,” he noted.

So far, three randomized trials on DOAC timing have been conducted, which Werring said suggested early DOAC treatment is safe. However, these trials have provided limited data on moderate to severe stroke, patients with hemorrhagic transformation, or those already taking oral anticoagulants — subgroups in which there are particular concerns about early oral anticoagulation.

The OPTIMAS trial included a broad population of patients with acute ischemic stroke associated with AF including these critical subgroups.

The trial, conducted at 100 hospitals in the United Kingdom, included 3648 patients with AF and acute ischemic stroke who were randomly assigned to early (≤ 4 days from stroke symptom onset) or delayed (7-14 days) anticoagulation initiation with any DOAC.

There was no restriction on stroke severity, and patients with hemorrhagic transformation were allowed, with the exception of parenchymal hematoma type 2, a rare and severe type of hemorrhagic transformation.

Approximately 35% of patients had been taking an oral anticoagulant, mainly DOACs, prior to their stroke, and about 30% had revascularization with thrombolysis, thrombectomy, or both. Nearly 900 participants (25%) had moderate to severe stroke (National Institutes of Health Stroke Scale [NIHSS] score ≥ 11).

The primary outcome was a composite of recurrent ischemic stroke, symptomatic intracranial hemorrhage, unclassifiable stroke, or systemic embolism incidence at 90 days. The initial analysis aimed to show noninferiority of early DOAC initiation, with a noninferiority margin of 2 percentage points, followed by testing for superiority.

Results showed that the primary outcome occurred in 3.3% of both groups (adjusted risk difference, 0.000; 95% CI, −0.011 to 0.012), with noninferiority criteria fulfilled. Superiority was not achieved.
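As a rough sanity check on these figures, the sketch below shows how a risk difference, its 95% CI, and the noninferiority test fit together. The per-arm counts are assumptions inferred from the reported 3.3% rate and 1:1 randomization of 3648 patients (the trial’s published estimate was adjusted, so the numbers here only approximate the paper’s):

```python
from math import sqrt

# Assumed illustrative counts: ~1824 patients per arm, 3.3% with a
# primary-outcome event in each arm (the trial's exact counts are in the paper).
n_early, x_early = 1824, 60   # early DOAC arm: patients, primary-outcome events
n_late, x_late = 1824, 60     # delayed-initiation arm

p1, p2 = x_early / n_early, x_late / n_late
rd = p1 - p2                  # unadjusted risk difference

# Wald standard error and 95% CI for a difference of two proportions
se = sqrt(p1 * (1 - p1) / n_early + p2 * (1 - p2) / n_late)
ci_low, ci_high = rd - 1.96 * se, rd + 1.96 * se

margin = 0.02                 # prespecified noninferiority margin: 2 percentage points
noninferior = ci_high < margin  # noninferior if the upper CI bound stays below the margin
superior = ci_high < 0          # superior only if the entire CI lies below zero

print(f"RD = {rd:.3f}, 95% CI ({ci_low:.3f}, {ci_high:.3f})")
print(f"noninferior: {noninferior}, superior: {superior}")
```

With these assumed counts the CI comes out near the reported −0.011 to 0.012, the upper bound sits well inside the 2-point margin (noninferiority met), and the interval straddles zero (no superiority), mirroring the trial’s conclusion.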

Symptomatic intracranial hemorrhage occurred in 0.6% of patients in the early DOAC initiation group vs 0.7% of those in the delayed group — a nonsignificant difference.
 

 

 

Applicable to Real-World Practice

A time-to-event analysis of the primary outcome showed that there were fewer outcomes in the first 30 days in the early DOAC initiation group, but the curves subsequently came together.

Subgroup analysis showed consistent results across the whole trial population, with no modification of the effect of early DOAC initiation according to stroke severity, reperfusion treatment, or previous anticoagulation.

Werring said that strengths of the OPTIMAS trial included a large sample size, a broad population with generalizability to real-world practice, and the inclusion of patients at higher bleeding risk than included in previous studies.

During the discussion, it was noted that the trial included few patients (about 3%) with very severe stroke (NIHSS score > 21), raising the question of whether the findings can be applied to this group.

Werring noted that there was no evidence of heterogeneity, and if anything, patients with more severe strokes may have had a slightly greater benefit with early DOAC initiation. “So my feeling is probably these results do generalize to the more severe patients,” he said.

In a commentary accompanying The Lancet publication of the OPTIMAS trial, Else Charlotte Sandset, MD, University of Oslo, in Norway, and Diana Aguiar de Sousa, MD, Central Lisbon University Hospital Centre, Lisbon, Portugal, noted that the “increasing body of evidence strongly supports the message that initiating anticoagulation early for patients with ischaemic stroke is safe. The consistent absence of heterogeneity in safety outcomes suggests that the risk of symptomatic intracranial haemorrhage is not a major concern, even in patients with large infarcts.”

Regardless of the size of the treatment effect, initiating early anticoagulation makes sense when it can be done safely, as it helps prevent recurrent ischemic strokes and other embolic events. Early intervention reduces embolization risk, particularly in high-risk patients, and allows secondary prevention measures to begin while patients are still hospitalized, they added.
 

CATALYST Findings

The CATALYST meta-analysis included four trials, namely, TIMING, ELAN, OPTIMAS, and START, of early versus later DOAC administration in a total of 5411 patients with acute ischemic stroke and AF. In this meta-analysis, early was defined as within 4 days of stroke and later as 5 days or more.

The primary outcome was a composite of ischemic stroke, symptomatic intracerebral hemorrhage, or unclassified stroke at 30 days. This was significantly reduced in the early group (2.12%) versus the later group (3.02%), giving an odds ratio of 0.70 (95% CI, 0.50-0.98; P = .04).
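For readers who want to see how an odds ratio of this size falls out of the event rates, here is a back-of-envelope reconstruction. The per-arm counts are assumptions derived from the reported percentages and an even split of the 5411 patients; the published pooled estimate came from a formal meta-analysis, so this is only an approximation:

```python
from math import exp, log, sqrt

# Assumed illustrative counts: ~5411 patients split roughly evenly between arms,
# with event rates of 2.12% (early) and 3.02% (later), as reported.
n_early, n_late = 2705, 2706
x_early = round(0.0212 * n_early)   # ~57 primary-outcome events, early arm
x_late = round(0.0302 * n_late)     # ~82 events, later arm

# Odds ratio, with a Wald 95% CI computed on the log-odds scale
or_ = (x_early / (n_early - x_early)) / (x_late / (n_late - x_late))
se_log = sqrt(1 / x_early + 1 / (n_early - x_early)
              + 1 / x_late + 1 / (n_late - x_late))
ci_low = exp(log(or_) - 1.96 * se_log)
ci_high = exp(log(or_) + 1.96 * se_log)

print(f"OR = {or_:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
```

Under these assumptions the odds ratio lands near 0.69 with a CI of roughly 0.49-0.97, close to the published 0.70 (0.50-0.98), and the upper bound below 1.0 is what makes the reduction statistically significant.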

The results were consistent across all subgroups, all suggesting an advantage for early DOAC.

Further analysis showed a clear benefit of early DOAC initiation in ischemic stroke with the curves separating early.

The rate of symptomatic intracerebral hemorrhage was low in both groups (0.45% in the early group and 0.40% in the later group) as was extracranial hemorrhage (0.45% vs 0.55%).

At 90 days, there were still lower event rates in the early group than the later one, but the difference was no longer statistically significant.
 

‘Practice Changing’ Results

Commenting on both studies, Craig Anderson, MD, The George Institute for Global Health, Sydney, Australia, who chaired the WSC session where the OPTIMAS trial and the meta-analysis were presented, described these latest results as “practice changing.”

“When to start anticoagulation in acute ischemic stroke patients with AF has been uncertain for a long time. The dogma has always been that we should wait. Over the years, we’ve become a little bit more confident, but now we’ve got good data from randomized trials showing that early initiation is safe, with the meta-analysis showing benefit,” he said.

“These new data from OPTIMAS will reassure clinicians that there’s no excessive harm and, more importantly, no excessive harm across all patient groups. And the meta-analysis clearly showed an upfront benefit of starting anticoagulation early. That’s a very convincing result,” he added.

Anderson cautioned that there still may be concerns about starting DOACs early in some groups, including Asian populations that have a higher bleeding risk (these trials included predominantly White patients) and people who are older or frail, who may have extensive small vessel disease.

During the discussion, several questions centered on the lack of imaging data available on the patients in the studies. Anderson said imaging data would help reassure clinicians on the safety of early anticoagulation in patients with large infarcts.

“Stroke clinicians make decisions on the basis of the patient and on the basis of the brain, and we only have the patient information at the moment. We don’t have information on the brain — that comes from imaging.”

Regardless, he believes these new data will lead to a shift in practice. “But maybe, it won’t be as dramatic as we would hope because I think some clinicians may still hesitate to apply these results to patients at high risk of bleeding. With imaging data from the studies that might change.”

The OPTIMAS trial was funded by University College London and the British Heart Foundation. Werring reported consulting fees from Novo Nordisk, National Institute for Health and Care Excellence, and Alnylam; payments or speaker honoraria from Novo Nordisk, Bayer, and AstraZeneca/Alexion; participation on a data safety monitoring board for the OXHARP trial; and participation as steering committee chair for the MACE-ICH and PLINTH trials. Åsberg received institutional research grants and lecture fees to her institution from AstraZeneca, Boehringer Ingelheim, Bristol Myers Squibb, and Institut Produits Synthése. Sandset and de Sousa were both steering committee members of the ELAN trial. Anderson reported grant funding from Penumbra and Takeda China.
 

A version of this article appeared on Medscape.com.

Publications
Topics
Sections

— The long-standing debate as to when to start anticoagulation in patients with an acute ischemic stroke and atrial fibrillation (AF) looks as though it’s settled.

Results of the OPTIMAS trial, the largest trial to address this question, showed that initiation of a direct oral anticoagulant (DOAC) within 4 days after ischemic stroke associated with AF was noninferior to delayed initiation (7-14 days) for the composite outcome of ischemic stroke, intracranial hemorrhage, unclassifiable stroke, or systemic embolism at 90 days. Importantly, early DOAC initiation was safe with a low rate of symptomatic hemorrhage, regardless of stroke severity.

In addition, a new meta-analysis, known as CATALYST, which included all four randomized trials now available on this issue, showed a clear benefit of earlier initiation (within 4 days) versus later (5 days and up) on its primary endpoint of new ischemic stroke, symptomatic intracerebral hemorrhage, and unclassified stroke at 30 days.

The results of the OPTIMAS trial and the meta-analysis were both presented at the 16th World Stroke Congress (WSC) 2024. The OPTIMAS trial was also simultaneously published online in The Lancet.

“Our findings do not support the guideline recommended practice of delaying DOAC initiation after ischemic stroke with AF regardless of clinical stroke severity, reperfusion or prior anticoagulation,” said OPTIMAS investigator David Werring, PhD, University College London in England.

Presenting the meta-analysis, Signild Åsberg, MD, Uppsala University, Uppsala, Sweden, said his group’s findings “support the early start of DOACs (within 4 days) in clinical practice.”

Werring pointed out that starting anticoagulation early also had important logistical advantages.

“This means we can start anticoagulation before patients are discharged from hospital, thus ensuring that this important secondary prevention medication is always prescribed, when appropriate. That’s going to be a key benefit in the real world.”
 

Clinical Dilemma

Werring noted that AF accounts for 20%-30% of ischemic strokes, which tend to be more severe than other stroke types. The pivotal trials of DOACs did not include patients within 30 days of an acute ischemic stroke, creating a clinical dilemma on when to start this treatment.

“On the one hand, we wish to start anticoagulation early to reduce early recurrence of ischemic stroke. But on the other hand, there are concerns that if we start anticoagulation early, it could cause intracranial bleeding, including hemorrhagic transformation of the acute infarct. Guidelines on this issue are inconsistent and have called for randomized control trials in this area,” he noted.

So far, three randomized trials on DOAC timing have been conducted, which Werring said suggested early DOAC treatment is safe. However, these trials have provided limited data on moderate to severe stroke, patients with hemorrhagic transformation, or those already taking oral anticoagulants — subgroups in which there are particular concerns about early oral anticoagulation.

The OPTIMAS trial included a broad population of patients with acute ischemic stroke associated with AF including these critical subgroups.

The trial, conducted at 100 hospitals in the United Kingdom, included 3648 patients with AF and acute ischemic stroke who were randomly assigned to early (≤ 4 days from stroke symptom onset) or delayed (7-14 days) anticoagulation initiation with any DOAC.

There was no restriction on stroke severity, and patients with hemorrhagic transformation were allowed, with the exception of parenchymal hematoma type 2, a rare and severe type of hemorrhagic transformation.

Approximately 35% of patients had been taking an oral anticoagulant, mainly DOACs, prior to their stroke, and about 30% had revascularization with thrombolysis, thrombectomy, or both. Nearly 900 participants (25%) had moderate to severe stroke (National Institutes of Health Stroke Scale [NIHSS] score ≥ 11).

The primary outcome was a composite of recurrent ischemic stroke, symptomatic intracranial hemorrhage, unclassifiable stroke, or systemic embolism incidence at 90 days. The initial analysis aimed to show noninferiority of early DOAC initiation, with a noninferiority margin of 2 percentage points, followed by testing for superiority.

Results showed that the primary outcome occurred in 3.3% of both groups (adjusted risk difference, 0.000; 95% CI, −0.011 to 0.012), with noninferiority criteria fulfilled. Superiority was not achieved.

Symptomatic intracranial hemorrhage occurred in 0.6% of patients in the early DOAC initiation group vs 0.7% of those in the delayed group — a nonsignificant difference.
 

 

 

Applicable to Real-World Practice

A time-to-event analysis of the primary outcome showed that there were fewer outcomes in the first 30 days in the early DOAC initiation group, but the curves subsequently came together.

Subgroup analysis showed consistent results across all whole trial population, with no modification of the effect of early DOAC initiation according to stroke severity, reperfusion treatment, or previous anticoagulation.

Werring said that strengths of the OPTIMAS trial included a large sample size, a broad population with generalizability to real-world practice, and the inclusion of patients at higher bleeding risk than included in previous studies.

During the discussion, it was noted that the trial included few (about 3%) patients — about 3% — with very severe stroke (NIHSS score > 21), with the question of whether the findings could be applied to this group.

Werring noted that there was no evidence of heterogeneity, and if anything, patients with more severe strokes may have had a slightly greater benefit with early DOAC initiation. “So my feeling is probably these results do generalize to the more severe patients,” he said.

In a commentary accompanying The Lancet publication of the OPTIMAS trial, Else Charlotte Sandset, MD, University of Oslo, in Norway, and Diana Aguiar de Sousa, MD, Central Lisbon University Hospital Centre, Lisbon, Portugal, noted that the “increasing body of evidence strongly supports the message that initiating anticoagulation early for patients with ischaemic stroke is safe. The consistent absence of heterogeneity in safety outcomes suggests that the risk of symptomatic intracranial haemorrhage is not a major concern, even in patients with large infarcts.”

Regardless of the size of the treatment effect, initiating early anticoagulation makes sense when it can be done safely, as it helps prevent recurrent ischemic strokes and other embolic events. Early intervention reduces embolization risk, particularly in high-risk patients, and allows secondary prevention measures to begin while patients are still hospitalized, they added.
 

CATALYST Findings

The CATALYST meta-analysis included four trials, namely, TIMING, ELAN, OPTIMAS, and START, of early versus later DOAC administration in a total of 5411 patients with acute ischemic stroke and AF. In this meta-analysis, early was defined as within 4 days of stroke and later as 5 days or more.

The primary outcome was a composite of ischemic stroke, symptomatic, intracerebral hemorrhage, or unclassified stroke at 30 days. This was significantly reduced in the early group (2.12%) versus 3.02% in the later group, giving an odds ratio of 0.70 (95% CI, 0.50-0.98; P =.04).

The results were consistent across all subgroups, all suggesting an advantage for early DOAC.

Further analysis showed a clear benefit of early DOAC initiation in ischemic stroke with the curves separating early.

The rate of symptomatic intracerebral hemorrhage was low in both groups (0.45% in the early group and 0.40% in the later group) as was extracranial hemorrhage (0.45% vs 0.55%).

At 90 days, there were still lower event rates in the early group than the later one, but the difference was no longer statistically significant.
 

‘Practice Changing’ Results

Commenting on both studies, chair of the WSC session where the results of both OPTIMAS trial and the meta-analysis were presented, Craig Anderson, MD, The George Institute for Global Health, Sydney, Australia, described these latest results as “practice changing.”

“When to start anticoagulation in acute ischemic stroke patients with AF has been uncertain for a long time. The dogma has always been that we should wait. Over the years, we’ve become a little bit more confident, but now we’ve got good data from randomized trials showing that early initiation is safe, with the meta-analysis showing benefit,” he said.

“These new data from OPTIMAS will reassure clinicians that there’s no excessive harm and, more importantly, no excessive harm across all patient groups. And the meta-analysis clearly showed an upfront benefit of starting anticoagulation early. That’s a very convincing result,” he added.

Anderson cautioned that there still may be concerns about starting DOACs early in some groups, including Asian populations that have a higher bleeding risk (these trials included predominantly White patients) and people who are older or frail, who may have extensive small vessel disease.

During the discussion, several questions centered on the lack of imaging data available on the patients in the studies. Anderson said imaging data would help reassure clinicians on the safety of early anticoagulation in patients with large infarcts.

“Stroke clinicians make decisions on the basis of the patient and on the basis of the brain, and we only have the patient information at the moment. We don’t have information on the brain — that comes from imaging.”

Regardless, he believes these new data will lead to a shift in practice. “But maybe, it won’t be as dramatic as we would hope because I think some clinicians may still hesitate to apply these results to patients at high risk of bleeding. With imaging data from the studies that might change.”

The OPTIMAS trial was funded by University College London and the British Heart Foundation. Werring reported consulting fees from Novo Nordisk, National Institute for Health and Care Excellence, and Alnylam; payments or speaker honoraria from Novo Nordisk, Bayer, and AstraZeneca/Alexion; participation on a data safety monitoring board for the OXHARP trial; and participation as steering committee chair for the MACE-ICH and PLINTH trials. Åsberg received institutional research grants and lecture fees to her institution from AstraZeneca, Boehringer Ingelheim, Bristol Myers Squibb, and Institut Produits Synthése. Sandset and de Sousa were both steering committee members of the ELAN trial. Anderson reported grant funding from Penumbra and Takeda China.
 

A version of this article appeared on Medscape.com.

— The long-standing debate as to when to start anticoagulation in patients with an acute ischemic stroke and atrial fibrillation (AF) looks as though it’s settled.

Results of the OPTIMAS trial, the largest trial to address this question, showed that initiation of a direct oral anticoagulant (DOAC) within 4 days after ischemic stroke associated with AF was noninferior to delayed initiation (7-14 days) for the composite outcome of ischemic stroke, intracranial hemorrhage, unclassifiable stroke, or systemic embolism at 90 days. Importantly, early DOAC initiation was safe with a low rate of symptomatic hemorrhage, regardless of stroke severity.

In addition, a new meta-analysis, known as CATALYST, which included all four randomized trials now available on this issue, showed a clear benefit of earlier initiation (within 4 days) versus later (5 days and up) on its primary endpoint of new ischemic stroke, symptomatic intracerebral hemorrhage, and unclassified stroke at 30 days.

The results of the OPTIMAS trial and the meta-analysis were both presented at the 16th World Stroke Congress (WSC) 2024. The OPTIMAS trial was also simultaneously published online in The Lancet.

“Our findings do not support the guideline recommended practice of delaying DOAC initiation after ischemic stroke with AF regardless of clinical stroke severity, reperfusion or prior anticoagulation,” said OPTIMAS investigator David Werring, PhD, University College London in England.

Presenting the meta-analysis, Signild Åsberg, MD, Uppsala University, Uppsala, Sweden, said his group’s findings “support the early start of DOACs (within 4 days) in clinical practice.”

Werring pointed out that starting anticoagulation early also had important logistical advantages.

“This means we can start anticoagulation before patients are discharged from hospital, thus ensuring that this important secondary prevention medication is always prescribed, when appropriate. That’s going to be a key benefit in the real world.”
 

Clinical Dilemma

Werring noted that AF accounts for 20%-30% of ischemic strokes, which tend to be more severe than other stroke types. The pivotal trials of DOACs did not include patients within 30 days of an acute ischemic stroke, creating a clinical dilemma on when to start this treatment.

“On the one hand, we wish to start anticoagulation early to reduce early recurrence of ischemic stroke. But on the other hand, there are concerns that if we start anticoagulation early, it could cause intracranial bleeding, including hemorrhagic transformation of the acute infarct. Guidelines on this issue are inconsistent and have called for randomized control trials in this area,” he noted.

So far, three randomized trials on DOAC timing have been conducted, which Werring said suggested early DOAC treatment is safe. However, these trials have provided limited data on moderate to severe stroke, patients with hemorrhagic transformation, or those already taking oral anticoagulants — subgroups in which there are particular concerns about early oral anticoagulation.

The OPTIMAS trial included a broad population of patients with acute ischemic stroke associated with AF including these critical subgroups.

The trial, conducted at 100 hospitals in the United Kingdom, included 3648 patients with AF and acute ischemic stroke who were randomly assigned to early (≤ 4 days from stroke symptom onset) or delayed (7-14 days) anticoagulation initiation with any DOAC.

There was no restriction on stroke severity, and patients with hemorrhagic transformation were allowed, with the exception of parenchymal hematoma type 2, a rare and severe type of hemorrhagic transformation.

Approximately 35% of patients had been taking an oral anticoagulant, mainly DOACs, prior to their stroke, and about 30% had revascularization with thrombolysis, thrombectomy, or both. Nearly 900 participants (25%) had moderate to severe stroke (National Institutes of Health Stroke Scale [NIHSS] score ≥ 11).

The primary outcome was a composite of recurrent ischemic stroke, symptomatic intracranial hemorrhage, unclassifiable stroke, or systemic embolism incidence at 90 days. The initial analysis aimed to show noninferiority of early DOAC initiation, with a noninferiority margin of 2 percentage points, followed by testing for superiority.

Results showed that the primary outcome occurred in 3.3% of both groups (adjusted risk difference, 0.000; 95% CI, −0.011 to 0.012), with noninferiority criteria fulfilled. Superiority was not achieved.

Symptomatic intracranial hemorrhage occurred in 0.6% of patients in the early DOAC initiation group vs 0.7% of those in the delayed group — a nonsignificant difference.
 

 

 

Applicable to Real-World Practice

A time-to-event analysis of the primary outcome showed that there were fewer outcomes in the first 30 days in the early DOAC initiation group, but the curves subsequently came together.

Subgroup analysis showed consistent results across all whole trial population, with no modification of the effect of early DOAC initiation according to stroke severity, reperfusion treatment, or previous anticoagulation.

Werring said that strengths of the OPTIMAS trial included a large sample size, a broad population with generalizability to real-world practice, and the inclusion of patients at higher bleeding risk than included in previous studies.

During the discussion, it was noted that the trial included few (about 3%) patients — about 3% — with very severe stroke (NIHSS score > 21), with the question of whether the findings could be applied to this group.

Werring noted that there was no evidence of heterogeneity, and if anything, patients with more severe strokes may have had a slightly greater benefit with early DOAC initiation. “So my feeling is probably these results do generalize to the more severe patients,” he said.

In a commentary accompanying The Lancet publication of the OPTIMAS trial, Else Charlotte Sandset, MD, University of Oslo, in Norway, and Diana Aguiar de Sousa, MD, Central Lisbon University Hospital Centre, Lisbon, Portugal, noted that the “increasing body of evidence strongly supports the message that initiating anticoagulation early for patients with ischaemic stroke is safe. The consistent absence of heterogeneity in safety outcomes suggests that the risk of symptomatic intracranial haemorrhage is not a major concern, even in patients with large infarcts.”

Regardless of the size of the treatment effect, initiating early anticoagulation makes sense when it can be done safely, as it helps prevent recurrent ischemic strokes and other embolic events. Early intervention reduces embolization risk, particularly in high-risk patients, and allows secondary prevention measures to begin while patients are still hospitalized, they added.
 

CATALYST Findings

The CATALYST meta-analysis included four trials, namely, TIMING, ELAN, OPTIMAS, and START, of early versus later DOAC administration in a total of 5411 patients with acute ischemic stroke and AF. In this meta-analysis, early was defined as within 4 days of stroke and later as 5 days or more.

The primary outcome was a composite of ischemic stroke, symptomatic intracerebral hemorrhage, or unclassified stroke at 30 days. It occurred in 2.12% of the early group versus 3.02% of the later group, giving an odds ratio of 0.70 (95% CI, 0.50-0.98; P = .04).
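As a quick sanity check, the reported odds ratio can be reproduced from the two raw event proportions alone. The sketch below is illustrative arithmetic, not the meta-analysis code:

```python
# Illustrative check (not the study's analysis code): derive the odds ratio
# for early vs later DOAC initiation from the reported event proportions.
def odds_ratio(p_early: float, p_later: float) -> float:
    """Odds ratio comparing two event proportions."""
    odds_early = p_early / (1 - p_early)
    odds_later = p_later / (1 - p_later)
    return odds_early / odds_later

# Primary composite outcome at 30 days: 2.12% (early) vs 3.02% (later).
print(round(odds_ratio(0.0212, 0.0302), 2))  # 0.7, matching the reported OR
```

At event rates this low, the odds ratio closely tracks the risk ratio, which is why a 0.70 OR here corresponds to roughly a 30% relative reduction in the composite outcome.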

The results were consistent across all subgroups, all suggesting an advantage for early DOAC initiation.

Further analysis showed a clear benefit of early DOAC initiation in ischemic stroke with the curves separating early.

The rate of symptomatic intracerebral hemorrhage was low in both groups (0.45% in the early group and 0.40% in the later group) as was extracranial hemorrhage (0.45% vs 0.55%).

At 90 days, there were still lower event rates in the early group than the later one, but the difference was no longer statistically significant.
 

‘Practice Changing’ Results

Commenting on both studies, Craig Anderson, MD, The George Institute for Global Health, Sydney, Australia, who chaired the WSC session where the OPTIMAS trial and the meta-analysis were presented, described these latest results as "practice changing."

“When to start anticoagulation in acute ischemic stroke patients with AF has been uncertain for a long time. The dogma has always been that we should wait. Over the years, we’ve become a little bit more confident, but now we’ve got good data from randomized trials showing that early initiation is safe, with the meta-analysis showing benefit,” he said.

“These new data from OPTIMAS will reassure clinicians that there’s no excessive harm and, more importantly, no excessive harm across all patient groups. And the meta-analysis clearly showed an upfront benefit of starting anticoagulation early. That’s a very convincing result,” he added.

Anderson cautioned that there still may be concerns about starting DOACs early in some groups, including Asian populations that have a higher bleeding risk (these trials included predominantly White patients) and people who are older or frail, who may have extensive small vessel disease.

During the discussion, several questions centered on the lack of imaging data available on the patients in the studies. Anderson said imaging data would help reassure clinicians on the safety of early anticoagulation in patients with large infarcts.

“Stroke clinicians make decisions on the basis of the patient and on the basis of the brain, and we only have the patient information at the moment. We don’t have information on the brain — that comes from imaging.”

Regardless, he believes these new data will lead to a shift in practice. “But maybe, it won’t be as dramatic as we would hope because I think some clinicians may still hesitate to apply these results to patients at high risk of bleeding. With imaging data from the studies that might change.”

The OPTIMAS trial was funded by University College London and the British Heart Foundation. Werring reported consulting fees from Novo Nordisk, National Institute for Health and Care Excellence, and Alnylam; payments or speaker honoraria from Novo Nordisk, Bayer, and AstraZeneca/Alexion; participation on a data safety monitoring board for the OXHARP trial; and participation as steering committee chair for the MACE-ICH and PLINTH trials. Åsberg received institutional research grants and lecture fees to her institution from AstraZeneca, Boehringer Ingelheim, Bristol Myers Squibb, and Institut Produits Synthése. Sandset and de Sousa were both steering committee members of the ELAN trial. Anderson reported grant funding from Penumbra and Takeda China.
 

A version of this article appeared on Medscape.com.


FROM WSC 2024


Study Finds Elevated Skin Cancer Risk Among US Veterans

Article Type
Changed
Wed, 11/06/2024 - 05:25

US veterans were nearly three times more likely to develop skin cancer than the general population, according to a large cross-sectional analysis of recent national data.

“US veterans are known to have increased risk of cancers and cancer morbidity compared to the general US population,” one of the study authors, Sepideh Ashrafzadeh, MD, a third-year dermatology resident at Massachusetts General Hospital, Boston, told this news organization following the annual meeting of the American Society for Dermatologic Surgery, where the results were presented. “There have been several studies that have shown that US veterans have an increased prevalence of melanoma compared to nonveterans,” she said, noting, however, that no study has investigated the prevalence of nonmelanoma skin cancers (NMSCs), which include basal cell carcinomas and squamous cell carcinomas, compared with the general population.


To address this knowledge gap, the researchers performed a national cross-sectional study of adults aged 18 years or older from the 2019-2023 National Health Interview Surveys to examine the prevalence of melanoma and NMSCs among veterans compared with the general US population. They aggregated and tabulated the data by veteran status, defined as having served at any point in the US armed forces, reserves, or national guard, and by demographic and socioeconomic status variables. Next, they performed multivariate logistic regression for skin cancer risk adjusted for age, sex, race, ethnicity, urbanicity, and disability status.

The study population consisted of 14,301 veterans and 209,936 nonveterans. Compared with nonveterans, veterans were more likely to have been diagnosed with skin cancer at some point in their lives (7% vs 2.4%; P < .001); had a higher mean age of skin cancer diagnosis (61.1 vs 55.8 years; P < .001); were more likely to have been diagnosed with melanoma (2.8% vs 0.9%; P < .001), and were more likely to have been diagnosed with NMSC (4.4% vs 1.6%; P < .001).

The researchers found that older age, White race, non-Hispanic ethnicity, and veteran status were all associated with higher odds of developing NMSCs, even after adjusting for relevant covariates. Specifically, veterans had 1.23 times the odds of developing NMSC compared with the general population, while two factors were protective against NMSCs: living in a rural setting (adjusted odds ratio [aOR], 0.78) and receiving supplemental security income or disability income (aOR, 0.69).

In another part of the study, the researchers evaluated demographic and socioeconomic variables associated with developing melanoma among veterans. These included male sex (aOR, 1.16), older age (50-64 years: aOR, 6.82; 65-74 years: aOR, 12.55; and 75 years or older: aOR, 16.16), White race (aOR, 9.24), and non-Hispanic ethnicity (aOR, 7.15).

“Veterans may have occupational risks such as sun and chemical exposure, as well as behavioral habits for sun protection, that may contribute to their elevated risk of melanoma and NMSCs,” Ashrafzadeh said. “Therefore, US veterans would benefit from targeted and regular skin cancer screenings, sun protective preventative resources such as hats and sunscreen, and access to medical and surgical care for diagnosis and treatment of skin cancers.”

Christine Ko, MD, professor of dermatology and pathology at Yale University, New Haven, Connecticut, who was asked to comment on the findings, said that a key strength of the study is that it drew from a nationally representative sample. “A limitation is that skin cancer was self-reported rather than based on documented medical histories,” Ko said. “The study confirms that skin cancer risk is higher in older individuals (> 75 as compared to < 50) and in individuals of self-reported white race and non-Hispanic ethnicity,” she added.

Neither the researchers nor Ko reported having relevant disclosures.
 

A version of this article first appeared on Medscape.com.


FROM ASDS 2024


Cancer’s Other Toll: Long-Term Financial Fallout for Survivors

Article Type
Changed
Mon, 10/28/2024 - 14:04

— While the physical toll of cancer is well documented, the financial toll can also be severe and lasting.

Overall, patients with cancer tend to face higher rates of debt collection, medical collections, and bankruptcies, as well as lower credit scores, according to two new studies presented at the American College of Surgeons Clinical Congress 2024.

“These are the first studies to provide numerical evidence of financial toxicity among cancer survivors,” Benjamin C. James, MD, with Beth Israel Deaconess Medical Center and Harvard Medical School, both in Boston, Massachusetts, who worked on both studies, said in a statement. “Previous data on this topic largely relies on subjective survey reviews.”

In one study, researchers used the Massachusetts Cancer Registry to identify 99,175 patients diagnosed with cancer between 2010 and 2019 and matched them with 188,875 control individuals without cancer. Researchers then assessed financial toxicity using Experian credit bureau data for participants.

Overall, patients with cancer faced a range of financial challenges that often lasted years following their diagnosis.

Patients were nearly five times more likely to experience bankruptcy and had average credit scores nearly 80 points lower than control individuals without cancer. The drop in credit scores was more pronounced for survivors of bladder, liver, lung, and colorectal cancer (CRC) and persisted for up to 9.5 years.

For certain cancer types, in particular, “we are looking years after a diagnosis, and we see that the credit score goes down and it never comes back up,” James said.

The other study, which used a sample of 7227 patients with CRC from Massachusetts, identified several factors that correlated with lower credit scores.

Compared with patients who had surgery alone, those who underwent radiation alone experienced a 62-point drop in their credit score after diagnosis, while those who had chemotherapy alone had just over a 14-point drop. Among patients who had combination treatments, those who underwent both surgery and radiation experienced a nearly 16-point drop, and those who had surgery plus chemoradiation actually saw a 2.59-point bump, compared with those who had surgery alone.

Financial toxicity was worse for patients younger than 62 years, those identifying as Black or Hispanic, unmarried individuals, those with an annual income below $52,000, and those living in deprived areas.

The studies add to findings from the 2015 North American Thyroid Cancer Survivorship Study, which reported that 50% of thyroid cancer survivors encountered financial toxicity because of their diagnosis.

James said the persistent financial strain of cancer care, even in a state like Massachusetts, which mandates universal healthcare, underscores the need for “broader policy changes and reforms, including reconsidering debt collection practices.”

“Financial security should be a priority in cancer care,” he added.

The studies had no specific funding. The authors have disclosed no relevant conflict of interest.

A version of this article first appeared on Medscape.com.


FROM ACSCS 2024


Mortality Rates From Early-Onset CRC Have Risen Considerably Over Last 2 Decades

Article Type
Changed
Thu, 10/31/2024 - 13:36

The mortality rate of early-onset colorectal cancer (EO-CRC) has increased considerably across the United States over the past 2 decades, with the effects most pronounced in those aged 20-44 years, according to a new analysis of the two largest US mortality databases. 

Data from the Centers for Disease Control and Prevention’s National Center of Health Statistics (NCHS) and the Surveillance, Epidemiology, and End Results (SEER) databases provide yet more evidence of the increasing prevalence of EO-CRC, which is defined as a diagnosis of CRC in patients younger than age 50 years. 

Furthermore, the researchers reported that increased mortality occurred across all patients included in the study (aged 20-54 years), regardless of tumor stage at diagnosis.

These findings “prompt tailoring further efforts toward raising awareness of colorectal cancer symptoms and keeping a low clinical suspicion in younger patients presenting with anemia, gastrointestinal bleeding, or change in bowel habits,” Yazan Abboud, MD, internal medicine PGY-3, assistant chief resident, and chair of resident research at Rutgers New Jersey Medical School, Newark, said in an interview.

Abboud presented the findings at the American College of Gastroenterology (ACG) 2024 Annual Scientific Meeting.
 

Analyzing NCHS and SEER 

Rising rates of EO-CRC had prompted US medical societies to recommend reducing the screening age to 45 years. The US Preventive Services Task Force officially lowered it to this age in 2021. This shift is supported by real-world evidence, which shows that earlier screening leads to a significantly reduced risk for colorectal cancer. However, because colorectal cancer cases are decreasing overall in older adults, there is considerable interest in discovering why young adults are experiencing a paradoxical uptick in EO-CRC, and what impact this is having on associated mortality.

Abboud and colleagues collected age-adjusted mortality rates for EO-CRC between 2000 and 2022 from the NCHS database. In addition, stage-specific incidence-based mortality rates for 2004-2020 were obtained from the SEER 22 database. The NCHS database covers approximately 100% of the US population, whereas the SEER 22 database, which is included within the NCHS, covers 42%. 

The researchers divided patients into two cohorts based on age (20-44 years and 45-54 years) and tumor stage at diagnosis (early stage and late stage), and compared the annual percentage change (APC) and the average APC between the two groups. They also assessed trends for the entire cohort of patients aged 20-54 years. 
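For readers unfamiliar with the annual percentage change (APC) metric used throughout these results, it is conventionally derived from a log-linear fit of the rate against calendar year. The sketch below uses made-up rates purely to show the calculation; it is not the study's data or its analysis software:

```python
import math

# Illustrative APC calculation from a log-linear trend (hypothetical rates,
# not the study's data). APC = 100 * (exp(slope) - 1), where slope comes from
# an ordinary least squares fit of ln(rate) on calendar year.
years = [2005, 2010, 2015, 2020]
rates = [1.00, 1.04, 1.09, 1.14]  # made-up age-adjusted deaths per 100,000

n = len(years)
mean_x = sum(years) / n
mean_y = sum(math.log(r) for r in rates) / n

# OLS slope of ln(rate) vs year.
num = sum((x - mean_x) * (math.log(y) - mean_y) for x, y in zip(years, rates))
den = sum((x - mean_x) ** 2 for x in years)
slope = num / den

apc = 100 * (math.exp(slope) - 1)  # percent change in the rate per year
print(round(apc, 2))  # 0.88 for these made-up rates
```

An APC of 0.87, as reported for 2005-2022 in the NCHS data, therefore corresponds to mortality rising a little under 1% per year, compounding over the period.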

In the NCHS database, there were 147,026 deaths from EO-CRC in total across all ages studied, of which 27% (39,746) occurred in those aged 20-44 years. Although associated mortality rates decreased between 2000 and 2005 in all ages studied (APC, -1.56), they increased from 2005 to 2022 (APC, 0.87). 

In the cohort aged 45-54 years, mortality decreased between 2000 and 2005 and increased thereafter, whereas in the cohort aged 20-44 years, mortality increased steadily over the entire follow-up period of 2000-2022 (APC, 0.93). A comparison of the age cohorts confirmed that those aged 20-44 years had a greater increase in mortality (average APC, 0.85; P < .001).

In the SEER 22 database, there were 4652 deaths in those with early-stage tumors across all age groups studied (average APC, 12.17). Mortality increased in patients aged 45-54 years with early-stage tumors (average APC, 11.52), but there were insufficient numbers in those aged 20-44 years to determine this outcome. 

There were 42,120 deaths in those with late-stage tumors across all age groups (average APC, 10.05) in the SEER 22 database. Increased mortality was observed in those with late-stage tumors in both age cohorts: 45-54 years (average APC, 9.58) and 20-44 years (average APC, 11.06).

“When evaluating the SEER database and stratifying the tumors by stage at diagnosis, we demonstrated increasing mortality of early-onset colorectal cancer in both early- and late-stage tumors on average over the study period,” Abboud said. 
 

 

 

Identifying At-Risk Patients

In a comment, David A. Johnson, MD, professor of medicine and chief of gastroenterology at Eastern Virginia School of Medicine in Norfolk, said the findings speak to the need for evidence-based means of identifying younger individuals at a higher risk of EO-CRC.

“I suspect many of younger patients with CRC had their cancer detected when it was more advanced due to delayed presentation and diagnostic testing,” said Johnson, who was not involved in the study. 

But it would be interesting to evaluate if the cancers in the cohort aged 20-44 years were more aggressive biologically or if these patients were dismissive of early signs or symptoms, he said. 

Younger patients may dismiss “alarm” features that indicate CRC testing, said Johnson. “In particular, overt bleeding and iron deficiency need a focused evaluation in these younger cohorts.”

“Future research is needed to investigate the role of neoadjuvant chemotherapy in younger patients with early-stage colorectal cancer and evaluate patients’ outcomes,” Abboud added. 

The study had no specific funding. Abboud reported no relevant financial relationships. Johnson reported serving as an adviser to ISOTHRIVE. He is also on the Medscape Gastroenterology editorial board.

A version of this article first appeared on Medscape.com.





FROM ACG 2024


Study Compares Punch Excision vs. Core Excision for Recalcitrant Keloids

Article Type
Changed
Mon, 10/28/2024 - 12:03

Punch excision (PE) followed by immediate cryotherapy could be a viable and simpler alternative to core excision (CE) for the treatment of recalcitrant keloids, according to the results of a small retrospective study.

The method “offers similar efficacy, faster healing, and fewer complications,” one of the study authors, Jinwoong Jung, MD, said in an interview following the annual meeting of the American Society for Dermatologic Surgery, where he presented the study results during an oral abstract session.

For the study, Jung, a dermatologist at Yonsei University College of Medicine, Seoul, South Korea, and colleagues retrospectively analyzed 22 patients with recalcitrant keloids treated with cryotherapy immediately following either PE or CE between May 2019 and March 2024. They used the Vancouver Scar Scale (VSS) to assess treatment efficacy.

Of the 22 patients, 16 underwent treatment with CE and 6 underwent treatment with PE. Pretreatment VSS scores showed no significant differences between the groups (P = .535). The CE group had a reduction in the VSS score from 8.13 to 4.00, while the PE group had a reduction from 7.83 to 3.67, but these declines did not differ significantly (P = .737). The PE group exhibited a shorter healing time than the CE group (a mean of 43.5 vs 63.87 days, respectively), though this difference was not statistically significant (P = .129).

“The uniqueness of this work lies in its simplified use of PE for recalcitrant keloids, which demonstrated efficacy comparable to CE, with the potential advantage of faster healing times,” Jung said. “Future studies with larger sample sizes and extended follow-up periods could help establish this approach as a standard treatment method.”

He acknowledged certain limitations of the study, including its small sample size and the lack of long-term follow-up data. The researchers reported having no relevant disclosures.

A version of this article first appeared on Medscape.com.




FROM ASDS 2024


Neurologists Lack Awareness of Steroid Toxicity

Article Type
Changed
Mon, 10/28/2024 - 09:45

There is a lack of understanding among neuromuscular specialists on how to balance the risks for and benefits of corticosteroids when treating patients with generalized myasthenia gravis (gMG) and chronic inflammatory demyelinating polyneuropathy (CIDP), results of a US survey showed.

For both gMG and CIDP specialists, uncertainty around corticosteroid dosing, duration, and toxicity underscores the need for more guidance, the investigators noted. Over 85% of respondents indicated that a tool for systematically monitoring corticosteroid toxicity would be valuable.

The results indicate “a lack of knowledge by this pool of neurologists about the guidelines and what they contain,” said study investigator Gil Wolfe, MD, professor of neurology at the Jacobs School of Medicine and Biomedical Sciences, University at Buffalo, in New York.

Clearer guidance on how to administer corticosteroids and manage toxicities in patients with gMG and CIDP “would be welcomed by neurologists and have potential for benefit to patient care,” the team noted.

The findings were presented at the American Association of Neuromuscular & Electrodiagnostic Medicine (AANEM) 2024.
 

Lack of Knowledge

Although guidelines for both CIDP and gMG recommend corticosteroids as first-line treatment and emphasize using the lowest effective dose to control symptoms, they do not include specific recommendations on dosing, duration, or toxicity monitoring, the researchers noted.

Despite this, a large proportion of survey respondents reported using guidelines to make clinical decisions on monitoring toxicity, with up to a third actually endorsing a guideline that doesn’t exist.

The cross-sectional, online survey was deployed in November and December 2023 and included 200 US neurologists. Of these, 99 answered questions on CIDP, and 101 answered similar questions on gMG.

To participate in the survey, respondents had to be board-certified neurologists, practicing for at least 2 years post-residency, and have treated or consulted on at least three patients with CIDP or 10 patients with gMG in the past year who were on a corticosteroid dose of at least 10 mg daily for 1 month or more.

CIDP respondents had been practicing a mean of 18.1 years since residency and were board certified in neuromuscular (20%), electrodiagnostic medicine/clinical neurophysiology (21%), and pediatric neurology (8%). Two thirds of them accepted referrals from other neurologists.

The gMG respondents had been practicing a mean of 20.5 years since residency and were board certified in neuromuscular (45%), electrodiagnostic medicine/clinical neurophysiology (35%), and pediatric neurology (17%). A total of 72% accepted referrals from other neurologists.

Respondents estimated that about 60% of their patients with gMG and 58% of patients with CIDP were being treated with corticosteroids, with gMG and CIDP respondents reporting a mean of 26.4 and 15.6 patients, respectively, meeting the study’s dosing criteria.
 

Appropriate Dosing

When asked what chronic, long-term (≥ 6 months) corticosteroid dose they considered safe in terms of minimizing adverse events, 43% of CIDP respondents and 51% of gMG respondents considered corticosteroid doses of 10 mg/d or less (prednisone equivalent) well tolerated; additionally, 32% and 31%, respectively, considered 20-40 mg/d well tolerated. Moreover, they said only about half of their patients would be able to taper to less than 10 mg/d in less than 6 months.

“Studies suggest safety is not seen until patients are on doses at 5 mg/d or less,” Wolfe said. “There is not enough appreciation that doses at levels we once considered safe really do pose significant risk,” he added.

“With the increasing number of treatment options in MG and to a lesser extent in CIDP, we need to do all we can to use corticosteroids as judiciously as possible and be aware of side effects our patients may not even report unless we make a pointed effort to ask about them.”

Familiarity with corticosteroid toxicities was more common among gMG respondents, of whom 77% reported being very/extremely familiar, than among CIDP respondents, of whom 55% did. Appetite/weight gain was among the most commonly reported adverse effects (AEs) associated with long-term corticosteroid use (reported by 68% of CIDP and 58% of gMG respondents). Other common AEs were insulin resistance (53% of CIDP and 50% of gMG respondents), decreased bone density (47% and 48%, respectively), and immunosuppression (37% and 45%, respectively). Mood and behavioral changes were noted by 56% of CIDP and 37% of gMG respondents, particularly mood swings, irritability, mania, and sleep disorders.

When asked how they balanced the risk for and benefit of corticosteroids, more than 80% of CIDP specialists reported personally monitoring for corticosteroid-related toxicity, and 42% reported they collaborated with the patient’s primary care provider. However, fewer than 10% reported ordering lab tests. Among neurologists treating gMG, 84% said they typically monitor corticosteroid toxicity independently, while 41% reported doing so in collaboration with primary care providers.

Two thirds of CIDP respondents and 53% of gMG respondents reported using guidelines to make clinical decisions on monitoring toxicity, and 34% of gMG respondents actually endorsed using the Guideline for Systematic Surveillance of Steroid Safety, which does not exist.
 

‘A Big Issue’ in Neurology

Commenting on the results, Said R. Beydoun, MD, professor and division chief, Neuromuscular Medicine, Department of Neurology at Keck Medicine of University of Southern California, Los Angeles, said steroid toxicity is “a big issue” in neurology.

These patients can be on chronic therapy, and they aren’t really monitored for osteoporosis or other complications, he said, adding that neurologists aren’t always taking the necessary precautions to prevent steroid toxicity.

Beydoun estimated that about half of neurologists are not adequately familiar with balancing the efficacy of corticosteroids against their toxicity.

“Objective improvement, either on the functional scale or the muscle impairment scale — that’s really response to treatment. Whereas adverse effects of a treatment are something separate. The patient may be improving but also maybe developing other complications from the treatment,” he said.

Also commenting, Ghazala Hayat, MD, professor of neurology and director of neuromuscular and clinical neurophysiology services at Saint Louis University in St. Louis, said there is a clear need for more education.

“I always say prednisone is our best friend initially, and then it becomes the worst enemy. If you don’t see lots of neuromuscular patients, you might not know even how to recognize toxicity or how to taper. Or the opposite to that, if you taper too quickly, patients relapse.”

The study was funded by argenx. Wolfe reported serving on advisory boards for Alexion, argenx, UCB, and Johnson & Johnson. Neelam Goyal, MD, is a consultant/advisor for Alexion, argenx, Amgen, Janssen, Lycia Therapeutics, and UCB and has received grant support from argenx. Beydoun reported receiving research support and consulting and speaking fees from Healey Center, Amylyx, AB Science, Sanofi, Janssen, Genentech, Regeneron, UCB, Abcuro, argenx, Alnylam, AstraZeneca, CSL Behring, Grifols, Takeda, and Octapharma. Hayat reported speaker and advisory roles with argenx, Alexion, and MTPA.
 

A version of this article appeared on Medscape.com.


There is a lack of understanding among neuromuscular specialists on how to balance the risks for and benefits of corticosteroids when treating patients with generalized myasthenia gravis (gMG) and chronic inflammatory demyelinating polyneuropathy (CIDP), results of a US survey showed.

For both MG and CIDP specialists, uncertainty around corticosteroid dosing, duration, and toxicity underscores the need for more guidance, the investigators noted. Over 85% of respondents indicated that a tool for systematically monitoring corticosteroid toxicity would be valuable.

The results indicate “a lack of knowledge by this pool of neurologists about the guidelines and what they contain,” said study investigator Gil Wolfe, MD, professor of neurology at the Jacobs School of Medicine and Biomedical Sciences, University at Buffalo, in New York.

Clearer guidance on how to administer corticosteroids and manage toxicities in patients with gMG and CIDP “would be welcomed by neurologists and have potential for benefit to patient care,” the team noted.

The findings were presented at the American Association of Neuromuscular & Electrodiagnostic Medicine (AANEM) 2024.
 

Lack of Knowledge

Although guidelines for both CIDP and gMG recommend corticosteroids as first-line treatment and emphasize using the lowest effective dose to control symptoms, they do not include specific recommendations on dosing, duration, or toxicity monitoring, the researchers noted.

Despite this, a large proportion of survey respondents reported using guidelines to make clinical decisions on monitoring toxicity, with up to a third actually endorsing a guideline that doesn’t exist.

The cross-sectional, online survey was deployed in November and December 2023 and included 200 US neurologists. Of these, 99 answered questions on CIDP, and 101 answered similar questions on gMG.

To participate in the survey, respondents had to be board-certified neurologists, practicing for at least 2 years post-residency, and have treated or consulted on at least three patients with CIDP or 10 patients with gMG in the past year who were on a corticosteroid dose of at least 10 mg daily for 1 month or more.

CIDP respondents had been practicing a mean of 18.1 years since residency and were board certified in neuromuscular (20%), electrodiagnostic medicine/clinical neurophysiology (21%), and pediatric neurology (8%). Two thirds of them accepted referrals from other neurologists.

The gMG respondents had been practicing a mean of 20.5 years since residency and were board certified in neuromuscular (45%), electrodiagnostic medicine/clinical neurophysiology (35%), and pediatric neurology (17%). A total of 72% accepted referrals from other neurologists.

Respondents estimated that about 60% of their patients with gMG and 58% of patients with CIDP were being treated with corticosteroids, with gMG and CIDP respondents reporting a mean of 26.4 and 15.6 patients, respectively, meeting the study’s dosing criteria.
 

Appropriate Dosing

When asked what chronic, long-term (≥ 6 months) corticosteroid dose they considered safe in terms of minimizing adverse events, 43% of CIDP respondents and 51% of gMG respondents considered corticosteroid doses of 10 mg/d or less (prednisone equivalent) well tolerated; additionally, 32% and 31%, respectively, considered 20-40 mg/d well tolerated. Moreover, they said only about half of their patients would be able to taper to less than 10 mg/d in less than 6 months.

“Studies suggest safety is not seen until patients are on doses at 5 mg/d or less,” Wolfe said. “There is not enough appreciation that doses at levels we once considered safe really do pose significant risk,” he added.

“With the increasing number of treatment options in MG and to a lesser extent in CIDP, we need to do all we can to use corticosteroids as judiciously as possible and be aware of side effects our patients may not even report unless we make a pointed effort to ask about them.”

There is a lack of understanding among neuromuscular specialists of how to balance the risks and benefits of corticosteroids when treating patients with generalized myasthenia gravis (gMG) and chronic inflammatory demyelinating polyneuropathy (CIDP), results of a US survey showed.

For both gMG and CIDP specialists, uncertainty around corticosteroid dosing, duration, and toxicity underscores the need for more guidance, the investigators noted. Over 85% of respondents indicated that a tool for systematically monitoring corticosteroid toxicity would be valuable.

The results indicate “a lack of knowledge by this pool of neurologists about the guidelines and what they contain,” said study investigator Gil Wolfe, MD, professor of neurology at the Jacobs School of Medicine and Biomedical Sciences, University at Buffalo, in New York.

Clearer guidance on how to administer corticosteroids and manage toxicities in patients with gMG and CIDP “would be welcomed by neurologists and have potential for benefit to patient care,” the team noted.

The findings were presented at the American Association of Neuromuscular & Electrodiagnostic Medicine (AANEM) 2024.
 

Lack of Knowledge

Although guidelines for both CIDP and gMG recommend corticosteroids as first-line treatment and emphasize using the lowest effective dose to control symptoms, they do not include specific recommendations on dosing, duration, or toxicity monitoring, the researchers noted.

Despite this, a large proportion of survey respondents reported using guidelines to make clinical decisions on monitoring toxicity, with up to a third actually endorsing a guideline that doesn’t exist.

The cross-sectional, online survey was deployed in November and December 2023 and included 200 US neurologists. Of these, 99 answered questions on CIDP, and 101 answered similar questions on gMG.

To participate in the survey, respondents had to be board-certified neurologists, practicing for at least 2 years post-residency, and have treated or consulted on at least three patients with CIDP or 10 patients with gMG in the past year who were on a corticosteroid dose of at least 10 mg daily for 1 month or more.

CIDP respondents had been practicing a mean of 18.1 years since residency and were board certified in neuromuscular (20%), electrodiagnostic medicine/clinical neurophysiology (21%), and pediatric neurology (8%). Two thirds of them accepted referrals from other neurologists.

The gMG respondents had been practicing a mean of 20.5 years since residency and were board certified in neuromuscular (45%), electrodiagnostic medicine/clinical neurophysiology (35%), and pediatric neurology (17%). A total of 72% accepted referrals from other neurologists.

Respondents estimated that about 60% of their patients with gMG and 58% of patients with CIDP were being treated with corticosteroids, with gMG and CIDP respondents reporting a mean of 26.4 and 15.6 patients, respectively, meeting the study’s dosing criteria.
 

Appropriate Dosing

When asked what chronic, long-term (≥ 6 months) corticosteroid dose they considered safe in terms of minimizing adverse events, 43% of CIDP respondents and 51% of gMG respondents considered corticosteroid doses of 10 mg/d or less (prednisone equivalent) well tolerated; additionally, 32% and 31%, respectively, considered 20-40 mg/d well tolerated. Moreover, they said only about half of their patients would be able to taper to less than 10 mg/d in less than 6 months.

“Studies suggest safety is not seen until patients are on doses at 5 mg/d or less,” Wolfe said. “There is not enough appreciation that doses at levels we once considered safe really do pose significant risk,” he added.

“With the increasing number of treatment options in MG and to a lesser extent in CIDP, we need to do all we can to use corticosteroids as judiciously as possible and be aware of side effects our patients may not even report unless we make a pointed effort to ask about them.”

Familiarity with corticosteroid toxicities was more common among gMG respondents than among CIDP respondents: 77% versus 55% reported being very or extremely familiar. Appetite/weight gain was among the most commonly reported adverse effects (AEs) associated with long-term corticosteroid use (cited by 68% of CIDP and 58% of gMG respondents). Other common AEs were insulin resistance (53% of CIDP and 50% of gMG respondents), decreased bone density (47% and 48%, respectively), and immunosuppression (37% and 45%, respectively). Mood and behavioral changes were noted by 56% of CIDP and 37% of gMG respondents, particularly mood swings, irritability, mania, and sleep disorders.

When asked how they balanced the risks and benefits of corticosteroids, more than 80% of CIDP specialists reported personally monitoring for corticosteroid-related toxicity, and 42% reported they collaborated with the patient’s primary care provider. However, fewer than 10% reported ordering lab tests. Among neurologists treating gMG, 84% said they typically monitor corticosteroid toxicity independently, while 41% reported doing so in collaboration with primary care providers.

Two thirds of CIDP respondents and 53% of gMG respondents reported using guidelines to make clinical decisions on monitoring toxicity, and 34% of gMG respondents actually endorsed using the Guideline for Systematic Surveillance of Steroid Safety, which does not exist.
 

‘A Big Issue’ in Neurology

Commenting on the results, Said R. Beydoun, MD, professor and division chief, Neuromuscular Medicine, Department of Neurology at Keck Medicine of University of Southern California, Los Angeles, said steroid toxicity is “a big issue” in neurology.

These patients can be on chronic therapy, and they aren’t really monitored for osteoporosis or other complications, he said, adding that neurologists aren’t always taking the necessary precautions to prevent steroid toxicity.

Beydoun estimated that about half of neurologists are not adequately familiar with balancing the efficacy of corticosteroids against their toxicity.

“Objective improvement, either on the functional scale or the muscle impairment scale — that’s really response to treatment. Whereas adverse effects of a treatment are something separate. The patient may be improving but also maybe developing other complications from the treatment,” he said.

Also commenting, Ghazala Hayat, MD, professor of neurology and director of neuromuscular and clinical neurophysiology services at Saint Louis University in St. Louis, said there is a clear need for more education.

“I always say prednisone is our best friend initially, and then it becomes the worst enemy. If you don’t see lots of neuromuscular patients, you might not know even how to recognize toxicity or how to taper. Or the opposite to that, if you taper too quickly, patients relapse.”

The study was funded by argenx. Wolfe reported serving on advisory boards for Alexion, argenx, UCB, and Johnson & Johnson. Neelam Goyal, MD, is a consultant/advisor for Alexion, argenx, Amgen, Janssen, Lycia Therapeutics, and UCB and has received grant support from argenx. Beydoun reported receiving research support and consulting and speaking fees from Healey Center, Amylyx, AB Science, Sanofi, Janssen, Genentech, Regeneron, UCB, Abcuro, argenx, Alnylam, AstraZeneca, CSL Behring, Grifols, Takeda, and Octapharma. Hayat reported speaker and advisory roles with argenx, Alexion, and MTPA.
 

A version of this article appeared on Medscape.com.
