Lenabasum improved skin symptoms in dermatomyositis, but its future is uncertain
Patients taking lenabasum experienced greater reductions – some of them statistically significant – in the Cutaneous Dermatomyositis Disease Area and Severity Index (CDASI), a validated outcome designed to assess inflammatory skin involvement in the rare autoimmune disease, as well as improvements in patient-reported and biomarker outcomes, compared with those on placebo in a phase 2, double-blind, randomized, controlled study, dermatologist Victoria P. Werth, MD, and coinvestigators reported.
And in a recently completed phase 3 trial, reported by the manufacturer, a subpopulation of patients with active skin disease and no active muscle disease again showed greater reductions in CDASI activity scores – a secondary outcome in the trial.
However, the phase 3 DETERMINE trial produced negative findings overall. It enrolled a more heterogeneous group of patients – including those with both muscle weakness and skin involvement – and its primary outcome measure was a broader composite measure, the Total Improvement Score. The trial failed to meet this primary endpoint, Corbus Pharmaceuticals, the developer of lenabasum, announced in a press release in June 2021.
The phase 3 results are “frustrating” for patients with symptomatic and refractory skin manifestations of dermatomyositis (DM), given the promising findings from the phase 2 trial and from an open-label extension study, said Dr. Werth, professor of dermatology and medicine, University of Pennsylvania, Philadelphia, and principal investigator and coprincipal investigator of the phase 2 and phase 3 studies, respectively.
Dr. Werth is scheduled to present the results from the phase 3 trial at the annual European Alliance of Associations for Rheumatology meeting in June.
“With lenabasum, we have a therapy that doesn’t work for every patient, but does work for quite a number of them,” Dr. Werth said in an interview. “It’s oral, it’s not really that immunosuppressing, and there aren’t many side effects. Right now, patients are often being managed with steroids ... we really need treatments that are not as toxic.”
Robert Spiera, MD, a rheumatologist who led trials of lenabasum for treatment of diffuse cutaneous systemic sclerosis (dcSSc), agreed. “The CB2 agonist strategy is appealing because it’s nonimmunosuppressing and has both anti-inflammatory and antifibrotic properties,” he said in an interview. “I wouldn’t want to give up on it ... especially [for patients] with scleroderma and dermatomyositis who are treated with substantial drugs that are associated with morbidity.”
Lenabasum, he said, has proven to be “incredibly safe, and incredibly safe in the long term.”
While the phase 2 trial of the drug for dcSSc showed clear benefit over placebo, the phase 3 trial did not meet its primary endpoint using the American College of Rheumatology Combined Response Index in Diffuse Cutaneous Systemic Sclerosis.
It allowed background immunosuppressant therapy to reflect real-world clinical practice, and “there was such a high response rate to [that therapy, largely mycophenolate] that there was little room to show benefit beyond that,” said Dr. Spiera, director of the vasculitis and scleroderma program, Hospital for Special Surgery, New York.
The drug led to more improvement in the small subset of participants who were not receiving background immunotherapy during the trial, he noted.
Corbus is currently “seeking a partnership to further explore the drug” for treatment in different subpopulations, according to a company spokesperson. Results of a phase 2 trial of lenabasum for the treatment of systemic lupus erythematosus – with a pain rating as the primary outcome measure – are expected soon.
Phase 2 findings
The single-center phase 2 trial of lenabasum for DM enrolled 22 adults with minimal muscle involvement as evidenced by normal maximal resistance on muscle testing at entry and throughout the study. Most were taking immunosuppressant medication, and all had CDASI scores of at least 20, with mean scores in the severe range (> 26). Symptoms registered on patient-reported outcome measures were moderate to severe.
Patients received a half-dose of lenabasum (20 mg daily) for 1 month and a full dose (20 mg twice daily) for 2 months, or placebo, and were followed for an additional month without dosing.
Starting at day 43 – approximately 2 weeks after the dose was increased – there was “a trend for the change from baseline CDASI to be greater” in the lenabasum group, compared with those on placebo, Dr. Werth and colleagues reported. The differences reached statistical significance on day 113 (P = .038), a month after patients discontinued lenabasum, “suggesting that the modulation of the inflammatory response by lenabasum continued beyond its last dose.”
Five of the 11 patients treated with lenabasum (45%), and none of those on placebo, achieved at least a 40% reduction in the CDASI activity score by the end of the study.
Patients in the lenabasum group also had greater improvement in the Skindex-29 Symptoms scores – a patient-reported measure of itch – and improvements in other secondary efficacy outcomes, including pain, though these did not reach statistical significance.
Skin biopsies before and after treatment showed significant reductions in inflammatory cytokines relevant to DM pathogenesis. Patients treated with the CB2 agonist had a downward trend in the CD4+ T cell population, which correlated with decreased CDASI activity scores, for instance, and a decrease in IL-31 protein expression, which correlated with decreased Skindex-29 Symptoms scores, the investigators reported.
There were no serious adverse events related to the CB2 agonist, and no treatment discontinuations.
The main part of the phase 2 trial, conducted from 2015 to 2017, was followed by a 3-year, open-label extension, in which 20 of the 22 patients took lenabasum 20 mg twice a day. The drug continued to be safe and well tolerated, and the CDASI activity score and other outcomes improved through year 1 and remained stable thereafter, according to a poster presented by Dr. Werth at the 2021 EULAR meeting.
After 1 year in the open-label extension, 60%-70% of patients had achieved mild skin disease, and 75% had achieved at least a 40% reduction in CDASI activity.
“A lot of patients, even if they weren’t completely cleared, were much happier in terms of their itch,” said Dr. Werth, also chief of dermatology, Corporal Michael J. Crescenz Veterans Affairs Medical Center, Philadelphia. “It’s been difficult for a lot of them now that they’re off the long-term extension ... a lot of them are flaring.”
The future
In the lab, with funding from the National Institutes of Health, Dr. Werth is continuing to investigate how lenabasum may be working in DM. A paper just published in the open access journal Arthritis Research & Therapy describes CB2 receptor distribution and up-regulation on key immune cells in the skin and blood, and how, in DM skin, its highest expression is on dendritic cells.
Through both mechanistic and more clinical research, “it’s important to understand the characteristics of the people [lenabasum] worked in or didn’t work in,” she said.
And in clinical trials, it’s important to capture meaningful improvement from the patient perspective. “It may be,” she noted, “that more global, systemic assessments are not the way to go for autoimmune skin disease.”
For dcSSc, Dr. Spiera said, it’s possible that a CB2 agonist may be helpful for patients who have been on immunosuppressants, particularly mycophenolate, for more than 6 months “and are still struggling.”
The phase 2 trial in DM was funded by the National Institutes of Health, the Department of Veterans Affairs, and Corbus Pharmaceuticals. The phase 3 trials in DM and in dcSSc were funded by Corbus. Dr. Werth disclosed grant support from Corbus and several other pharmaceutical companies. Dr. Spiera disclosed that he has received grant support or consulting fees from Roche-Genentech, GlaxoSmithKline, and several other pharmaceutical companies.
A version of this article first appeared on Medscape.com.
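For intuition on how small a trial this comparison rests on, the sketch below runs a two-sided Fisher exact test on those responder counts (5 of 11 on lenabasum vs. 0 of 11 on placebo). This is an illustrative calculation only, not the analysis the investigators reported (their P = .038 refers to change in CDASI at day 113); the function is built from scratch on the hypergeometric distribution so it needs nothing beyond the standard library.

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]].

    With the row and column margins fixed, the first cell follows a
    hypergeometric distribution; the two-sided p-value sums the
    probabilities of all tables no more likely than the observed one.
    """
    row1 = a + b          # size of group 1 (e.g., lenabasum arm)
    col1 = a + c          # total responders across both groups
    n = a + b + c + d     # total sample size

    def prob(x):  # P(first cell == x) under fixed margins
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = prob(a)
    lo = max(0, row1 + col1 - n)
    hi = min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

# Illustrative responder counts from the phase 2 trial:
# 5/11 responders on lenabasum vs. 0/11 on placebo
p = fisher_exact_2x2(5, 6, 0, 11)
print(f"two-sided p = {p:.3f}")  # ~0.035
```

Even with 11 patients per arm, a 45% vs. 0% split in responders is unlikely to arise by chance, which is consistent with the investigators' view that the drug works well for a subset of patients.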
FROM THE JOURNAL OF INVESTIGATIVE DERMATOLOGY
Stroke in the young: Cancer in disguise?
The data were presented by Jamie Verhoeven, MD, Radboud University Medical Centre, the Netherlands, at the recent European Stroke Organisation Conference 2022.
Dr. Verhoeven noted that 10% of all stroke cases occur in individuals younger than 50 years. During the past few decades, the incidence of stroke in the young has steadily increased, whereas the incidence of stroke in older adults has stabilized or decreased.
“Stroke in the young differs from stroke in older patients, and one of the major differences is that stroke in the young has a higher proportion of cryptogenic stroke, with no clear cause found in over one-third of patients,” she said.
Also, having an active cancer is known to be a risk factor for thrombosis. This association is strongest in venous thrombosis and has been less well investigated in arterial thrombosis, Dr. Verhoeven reported.
Her group aimed to investigate whether in some patients with cryptogenic stroke, this may be the first manifestation of an underlying cancer. “If this hypothesis is true, then it would be more obvious in young patients who have a higher incidence of cryptogenic stroke,” she said.
They performed a population-based observational cohort study using diagnostic ICD codes from the national Hospital Discharge Registry in the Netherlands and the Dutch Population Registry from 1998 to 2019.
Patients with a history of cancer before their first stroke and those with central nervous system cancers at the time of stroke or nonmelanoma skin cancers (which have been shown to have no systemic effects) were excluded.
Reference data came from the Netherlands Comprehensive Cancer Organisation, which collects data on all cancer diagnoses in the country.
The researchers identified 27,616 young stroke patients (age range, 15-49 years; median age, 45 years) and 362,782 older stroke patients (age range, 50 years and older; median age, 76 years).
The cumulative incidence of any cancer at 10 years was 3.7% in the younger group and 8.5% in the older group.
The data were compared with matched peers from the general population. The main outcome measures were cumulative incidence of first-ever cancer after stroke (stratified by stroke subtype, age and sex) and standardized incidence rates.
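A standardized incidence rate (or ratio, SIR) compares the number of cancers observed in the stroke cohort with the number expected if the cohort experienced general-population cancer rates. The sketch below shows the standard calculation with an approximate log-scale confidence interval; the observed and expected counts are hypothetical numbers chosen for illustration, not figures from the study, and registries typically report an exact Poisson interval rather than this normal approximation.

```python
from math import exp, sqrt

def sir_with_ci(observed, expected, z=1.96):
    """Standardized incidence ratio with an approximate 95% CI.

    SIR = observed / expected; under a Poisson model for the observed
    count, the standard error of log(SIR) is roughly 1/sqrt(observed),
    giving a symmetric interval on the log scale.
    """
    sir = observed / expected
    se = 1 / sqrt(observed)  # SE of log(SIR), Poisson approximation
    lower = sir * exp(-z * se)
    upper = sir * exp(z * se)
    return sir, lower, upper

# Hypothetical counts: 52 cancers observed in young stroke patients
# in the first year vs. 20 expected from general-population rates.
sir, lower, upper = sir_with_ci(52, 20)
print(f"SIR = {sir:.1f} (95% CI, {lower:.1f}-{upper:.1f})")
```

Ratios like the study's 2.6 (95% CI, 2.2-3.1) after ischemic stroke are of exactly this form, with the interval width driven mainly by the number of observed cancer cases.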
Results showed that the risk for cancer was higher in the younger age group than in the matched general population.
In this age group, the 1-year risk of any new cancer was 2.6 times higher (95% confidence interval, 2.2-3.1) after ischemic stroke and 5.4 times higher (95% CI, 3.8-7.3) after intracerebral hemorrhage than in matched peers from the general population.
In contrast, in stroke patients older than 50 years, the 1-year risk for any new cancer was 1.2 times higher than in the general population after either ischemic or hemorrhagic stroke.
“The younger patients have a higher risk increase of cancer than older patients, and this risk increase is most evident in the first 1 to 2 years after stroke but remains statistically significant for up to 5 to 8 years later,” Dr. Verhoeven said.
The cancers that were most involved in this risk increase were those of the lower respiratory tract, hematologic cancers, and gastrointestinal cancers.
The main strength of this study was the use of national databases that allowed for a very large sample size, but this brings with it the danger of misclassification of events and the lack of clinical data, Dr. Verhoeven noted.
“Young stroke patients are at increased risk of developing a new cancer in the years following their stroke compared to peers from the general population, but this risk is only marginally increased in the older stroke population,” she concluded.
She pointed out that it is not possible to confirm any causal relation from this study design, but a clear association has been shown.
“We need more studies into this field. We need a large clinical dataset to examine which clinical phenotypes are associated with possible underlying cancers to identify which patients are most at risk. We are already working on this,” she said. “Then it remains to be investigated whether screening for an underlying cancer should be added to the diagnostic workup in young stroke patients.”
Commenting on the study after the presentation, William Whiteley, BM, PhD, a clinical epidemiologist at the University of Edinburgh, Scotland, and a consultant neurologist in NHS Lothian, said it was difficult to know whether the link shown between stroke and cancer was causal, but the effect size in this study was “quite large.”
He pointed out that the associations with bowel and lung cancer could be due to shared risk factors, such as smoking, but he said the finding on a link with hematologic cancers is “interesting.”
Noting that there are links between hematologic cancers and thrombotic events, he said: “People have wondered if that is because of clonal expansion, which has been shown to increase the risk of atherosclerosis, so the question is whether this is some kind of common risk factor here.”
Dr. Verhoeven said she did not believe that shared risk factors fully explained the difference in increased risks between young and older patients.
“It does not fully explain why the risk of cancer is specifically higher in the first 1 to 2 years after the stroke diagnosis. I would think if it was just shared risk factors, the risk increase should remain relatively stable, or even increase due to the build-up of exposure to risk factors over the years,” she said.
Dr. Whiteley said that data like these are “really useful in trying to estimate these associations and it gives us some hypotheses to investigate in smaller mechanistic studies.”
Asked whether these data justify screening younger cryptogenic stroke patients more systematically for cancer, Dr. Whiteley replied: “I think we need some absolute risk estimates for that; for example, what proportion of younger patients would be at risk over the next few years when that screening would make a difference.”
Dr. Verhoeven reports no disclosures.
A version of this article first appeared on Medscape.com.
The data were presented by Jamie Verhoeven, MD, Radboud University Medical Centre, the Netherlands, at the recent European Stroke Organisation Conference 2022.
Dr. Verhoeven noted that 10% of all stroke cases occur in individuals younger than 50 years. During the past few decades, the incidence of stroke in the young has steadily increased, whereas the incidence of stroke in older adults has stabilized or decreased.
“Stroke in the young differs from stroke in older patients, and one of the major differences is that stroke in the young has a higher proportion of cryptogenic stroke, with no clear cause found in over one-third of patients,” she said.
Also, having an active cancer is known to be a risk factor for thrombosis. This association is strongest in venous thrombosis and has been less well investigated in arterial thrombosis, Dr. Verhoeven reported.
Her group aimed to investigate whether in some patients with cryptogenic stroke, this may be the first manifestation of an underlying cancer. “If this hypothesis is true, then it would be more obvious in young patients who have a higher incidence of cryptogenic stroke,” she said.
They performed a population-based observational cohort study using diagnostic ICD codes from the national Hospital Discharge Registry in the Netherlands and the Dutch Population Registry from 1998 to 2019.
Patients with a history of cancer before their first stroke and those with central nervous system cancers at the time of stroke or nonmelanoma skin cancers (which have been shown to have no systemic effects) were excluded.
Reference data came from the Netherlands Comprehensive Cancer Organisation, which collects data on all cancer diagnoses in the country.
The researchers identified 27,616 young stroke patients (age range, 15-49 years; median age, 45 years) and 362,782 older stroke patients (age range, 50 years and older; median age, 76 years).
The cumulative incidence of any cancer at 10 years was 3.7% in the younger group and 8.5% in the older group.
The data were compared with matched peers from the general population. The main outcome measures were cumulative incidence of first-ever cancer after stroke (stratified by stroke subtype, age and sex) and standardized incidence rates.
Results showed that the risk for cancer was higher in the younger age group than in the matched general population.
In this age group, the 1-year risk of any new cancer was 2.6 times higher (95% confidence interval, 2.2-3.1) after ischemic stroke and 5.4 times higher (95% CI, 3.8-7.3) after intracerebral hemorrhage than in matched peers from the general population.
In contrast, in stroke patients older than 50 years, the 1-year risk for any new cancer was 1.2 times higher than the general population after either ischemic or hemorrhagic stroke.
“The younger patients have a higher risk increase of cancer than older patients, and this risk increase is most evident in the first 1 to 2 years after stroke but remains statistically significant for up to 5 to 8 years later,” Dr. Verhoeven said.
The cancers that were most involved in this risk increase were those of the lower respiratory tract, hematologic cancers, and gastrointestinal cancers.
The main strength of this study was the use of national databases, which allowed for a very large sample size; the trade-off is a risk of misclassification of events and a lack of clinical data, Dr. Verhoeven noted.
“Young stroke patients are at increased risk of developing a new cancer in the years following their stroke compared to peers from the general population, but this risk is only marginally increased in the older stroke population,” she concluded.
She pointed out that it is not possible to confirm any causal relation from this study design, but a clear association has been shown.
“We need more studies into this field. We need a large clinical dataset to examine which clinical phenotypes are associated with possible underlying cancers to identify which patients are most at risk. We are already working on this,” she said. “Then it remains to be investigated whether screening for an underlying cancer should be added to the diagnostic workup in young stroke patients.”
Commenting on the study after the presentation, William Whiteley, BM, PhD, a clinical epidemiologist at the University of Edinburgh, Scotland, and a consultant neurologist in NHS Lothian, said it was difficult to know whether the link shown between stroke and cancer was causal, but the effect size in this study was “quite large.”
He pointed out that the associations with bowel and lung cancer could be due to shared risk factors, such as smoking, but he said the finding on a link with hematologic cancers is “interesting.”
Noting that there are links between hematologic cancers and thrombotic events, he said: “People have wondered if that is because of clonal expansion, which has been shown to increase the risk of atherosclerosis, so the question is whether this is some kind of common risk factor here.”
Dr. Verhoeven said she did not believe that shared risk factors fully explained the difference in increased risks between young and older patients.
“It does not fully explain why the risk of cancer is specifically higher in the first 1 to 2 years after the stroke diagnosis. I would think if it was just shared risk factors, the risk increase should remain relatively stable, or even increase due to the build-up of exposure to risk factors over the years,” she said.
Dr. Whiteley said that data like these are “really useful in trying to estimate these associations and it gives us some hypotheses to investigate in smaller mechanistic studies.”
Asked whether these data justify screening younger cryptogenic stroke patients more systematically for cancer, Dr. Whiteley replied: “I think we need some absolute risk estimates for that; for example, what proportion of younger patients would be at risk over the next few years when that screening would make a difference.”
Dr. Verhoeven reports no disclosures.
A version of this article first appeared on Medscape.com.
FROM ESOC 2022
Gallstone disease may be a harbinger of pancreatic cancer
Patients with pancreatic ductal adenocarcinoma (PDAC) were six times more likely to have had gallstone disease in the year prior to diagnosis than noncancer patients, researchers found in a SEER-Medicare database analysis.
“We can’t be certain at this time as to whether gallstone disease is a precursor to PDAC or whether it is the end result of PDAC, but we do know there is an association, and we plan to explore it further,” commented study author Teviah Sachs, MD, MPH, Boston Medical Center.
“We don’t want anyone with gallstone disease to think that they have pancreatic cancer because, certainly, the overwhelming majority of patients with gallstone disease do not have pancreatic cancer,” he emphasized.
“But I would say to physicians that if you have a patient who presents with gallstone disease and they have other symptoms, you should not necessarily attribute those symptoms just to their gallstone disease,” Dr. Sachs commented.
“The diagnosis of pancreatic cancer should be on the differential in patients who present with symptoms that might not otherwise correlate with typical gallstones,” he added.
Dr. Sachs was speaking at a press briefing ahead of the annual Digestive Disease Week® (DDW), where the study will be presented.
“PDAC is often fatal because it’s frequently not diagnosed until it is late-stage disease,” Dr. Sachs noted.
Complicating earlier diagnosis is the fact that symptoms of PDAC often mirror those associated with gallstone disease and gallbladder infection, “both of which have been demonstrated to be risk factors for PDAC,” Dr. Sachs added.
Annual incidence
The purpose of the present study was to compare the incidence of cholelithiasis or cholecystitis in the year before a diagnosis of PDAC with the annual incidence in the general population.
A total of 18,700 patients with PDAC, median age 76 years, were identified in the SEER-Medicare database between 2008 and 2015. The incidence of hospital visits for gallstone disease in the year prior to PDAC diagnosis as well as the annual incidence of gallstone disease in the SEER-Medicare noncancer cohort were assessed.
An average of 99,287 patients per year were available from the noncancer cohort, 0.8% of whom had gallstone disease and 0.3% of whom had their gallbladders removed. In contrast, in the year before their diagnosis, 4.7% of PDAC patients had a diagnosis of gallstone disease and 1.6% had their gallbladders removed.
“Gallstone disease does not cause pancreatic cancer,” lead author, Marianna Papageorge, MD, research fellow, also of Boston Medical Center, said in a statement.
“But understanding its association with PDAC can help combat the high mortality rate with pancreatic cancer by providing the opportunity for earlier diagnosis and treatment,” she added.
A version of this article first appeared on Medscape.com.
FROM DDW 2022
Value of screening urinalysis before office procedures questioned
Screening urinalysis before office-based urology procedures may be unnecessary, according to the results of a randomized trial.
Some centers perform the pre-procedure test to avoid urinary tract infections (UTIs), a feared iatrogenic complication, but the new findings indicate the step is unnecessary.
These results “will alter screening urinalysis practice at our hospital,” said Alexa Rose, a clinical research coordinator in the department of urology, University of Wisconsin School of Medicine and Public Health, Madison. Ms. Rose’s group presented the findings at the annual meeting of the American Urological Association.
Although rates of postprocedure UTI are generally low after office-based cystoscopy, it is the most common complication, Ms. Rose said. To minimize risk, preprocedural urinalysis had become standard practice at her institution.
For the study, Ms. Rose and colleagues sought to determine if the testing was indeed helping reduce the risk of UTI. They randomly divided 641 patients into two groups. Both received urinalysis, but test results of participants in the experimental group were not forwarded to clinicians.
Patients were undergoing one of three types of office urology procedures: cystoscopy (66.6%), intravesical therapy for bladder cancer (24.5%), and prostate biopsy (8.9%). Median age was 70 years and most participants (83%) were men.
The primary endpoint was a symptomatic UTI confirmed by culture 30 days after the procedure.
In the 323 patients managed without access to the results of urinalysis, the rate of UTI was 1.2%. In the 318 patients who received usual care guided by urinalysis, the rate of UTI was 1.6% – and the difference was a single case.
The nonsignificant difference fell easily within the study definition of noninferiority, according to Ms. Rose. Others offering preprocedural urinalysis should take heed.
“Due to the large cohort of patients we enrolled, we expect that it will be applicable to other institutions,” she said in an interview.
Expert pushes back
In a 2020 Best Practice Statement from the AUA on antibiotic prophylaxis, the risk of procedure-related UTI was considered to be much lower in outpatient versus inpatient settings.
The statement identified a long list of variables to guide screening for UTI and initiation of prophylactic antibiotics prior to urology procedures for hospitalized patients, but office-based procedures in low-risk, largely healthy patients were treated differently.
As a result, the operating hypothesis of the Wisconsin study is “flawed,” said Anthony J. Schaeffer, MD, professor of urology, Feinberg School of Medicine at Northwestern University, Chicago.
“An asymptomatic patient undergoing office cystoscopy for [such indications as] hematuria or bladder tumor doesn’t need pre-procedural urinalysis or prophylactic antibiotics unless they have risk factors, such as immunosuppression,” Dr. Schaeffer told this news organization. “There is no relationship between preprocedural urinalysis and a post-procedure UTI caused by instrumentation.”
According to the AUA, neither antibiotic prophylaxis nor screening such as urodynamic studies is recommended prior to simple outpatient cystoscopy if patients are otherwise healthy and have no signs or symptoms of a UTI.
While antibiotic prophylaxis is standard of care for some outpatient urological procedures, such as transrectal ultrasound (TRUS)-guided prostate biopsies, the practice is appropriate whether or not patients undergo urinalysis, according to Dr. Schaeffer.
As a result, one problem with the new study was the lack of discussion about antibiotic prophylaxis.
“Presumably the patients undergoing TRUS prostate biopsies received antibiotic prophylaxis, which is a critical confounder that they do not even mention,” Dr. Schaeffer said.
In patients with UTI symptoms, some screening is appropriate whether with a simple dipstick, laboratory-performed microscopy, or culture, according to the AUA.
For asymptomatic patients undergoing a class I (clean) procedure, the AUA statement holds – and Dr. Schaeffer said he agreed – that antibiotic prophylaxis, let alone urinalysis, is not standard, particularly for simple outpatient cystoscopy.
Ms. Rose and Dr. Schaeffer have reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM THE AUA ANNUAL MEETING
Dupilumab for Allergic Contact Dermatitis: An Overview of Its Use and Impact on Patch Testing
Dupilumab is a humanized monoclonal antibody approved by the US Food and Drug Administration (FDA) for the treatment of moderate to severe atopic dermatitis. Through inhibition of the IL-4Rα subunit, it prevents activation of the IL-4/IL-13 signaling cascade. This dampens the TH2 inflammatory response, thereby improving the symptoms associated with atopic dermatitis.1,2 Recent literature suggests that dupilumab may be useful in the treatment of other chronic dermatologic conditions, including allergic contact dermatitis (ACD) refractory to allergen avoidance and other treatments. Herein, we provide an overview of ACD, the role that dupilumab may play in its management, and its impact on patch testing results.
Pathogenesis of ACD
Allergic contact dermatitis is a cell-mediated type IV hypersensitivity reaction that develops through 2 distinct stages. In the sensitization phase, an allergen penetrates the skin and subsequently is engulfed by a cutaneous antigen-presenting cell. The allergen is then combined with a peptide to form a complex that is presented to naïve T lymphocytes in regional lymph nodes. The result is clonal expansion of a T-cell population that recognizes the allergen. In the elicitation phase, repeat exposure to the allergen leads to the recruitment of primed T cells to the skin, followed by cytokine release, inflammation, and resultant dermatitis.3
Historically, ACD was thought to be primarily driven by the TH1 inflammatory response; however, it is now known that TH2, TH9, TH17, and TH22 also may play a role in its pathogenesis.4,5 Another key finding is that the immune response in ACD appears to be at least partially allergen specific. Molecular profiling has revealed that nickel primarily induces a TH1/TH17 response, while allergens such as fragrance and rubber primarily induce a TH2 response.4
Management of ACD
Allergen avoidance is the mainstay of ACD treatment; however, in some patients, this approach does not always improve symptoms. In addition, eliminating the source of the allergen may not be possible in those with certain occupational, environmental, or medical exposures.
There are no FDA-approved treatments for ACD. When allergen avoidance alone is insufficient, first-line pharmacologic therapy typically includes topical or oral corticosteroids, the choice of which depends on the extent and severity of the dermatitis; however, a steroid-sparing agent often is preferred to avoid the unfavorable effects of long-term steroid use. Other systemic treatments for ACD include methotrexate, cyclosporine, mycophenolate mofetil, and azathioprine.6 These agents are used for severe ACD and typically are chosen as a last resort due to their immunosuppressive activity.
Phototherapy is another option, often as an adjunct to other therapies. Narrowband UVB and psoralen plus UVA have both been used. Psoralen plus UVA tends to have more side effects; therefore, narrowband UVB often is preferred.7,8
Use of Dupilumab in ACD
Biologics are unique in that they target a single step in the immune response, which can nonetheless improve a wide variety of symptoms. Research investigating their role as a treatment modality for ACD is still evolving alongside our increasing knowledge of its pathophysiology.9 Of note, studies examining the anti–IL-17 biologic secukinumab found it to be ineffective against ACD,10,11 which suggests that targeting a specific immune component may not always improve ACD symptoms, likely because the pathophysiology of ACD involves several pathways.
There have been multiple reports demonstrating the effectiveness of dupilumab in the treatment of ACD (eTable).12-20 The findings from these studies show that dupilumab can improve recalcitrant dermatitis caused by a broad range of contact allergens, including nickel. This highlights its ability to improve ACD caused by allergens with a TH1 bias, despite its primarily TH2-dampening effects. Notably, several studies have reported successful use of dupilumab for systemic ACD.12,18 In addition, dupilumab may be able to improve symptoms of ACD in as little as 1 to 4 weeks. Unlike some systemic therapies for ACD, dupilumab also benefits from its lack of notable immunosuppressive effects.9 A phase 4 clinical trial at Brigham and Women’s Hospital (Boston, Massachusetts) is recruiting participants, with a primary goal of investigating dupilumab’s impact on ACD in patients who have not improved despite allergen avoidance (ClinicalTrials.gov identifier NCT03935971).
There are a few potential disadvantages to dupilumab. Because it is not yet FDA approved for the treatment of ACD, insurance companies may deny coverage, making it likely to be unaffordable for most patients. Furthermore, the side-effect profile has not been fully characterized. In addition to ocular adverse effects, a growing number of studies have reported face and neck erythema after starting dupilumab. Although the cause is unclear, one theory is that the inhibition of IL-4/IL-13 leads to TH1/TH17 polarization, thereby worsening ACD caused by allergens that activate a TH1-predominant response.21 Finally, not all cases of ACD respond to dupilumab.22
Patch Testing While on Dupilumab
Diagnosing ACD is a challenging process. An accurate history and physical examination are critical, and patch testing remains the gold standard for identifying the causative contact allergen(s).
There is ongoing debate among contact dermatitis experts regarding the diagnostic accuracy of patch testing for those on immunomodulators or immunosuppressants, as these medications can dampen positive results and increase the risk for false-negative readings.23 Consequently, some have questioned whether patch testing on dupilumab is accurate or feasible.24 Contact dermatitis experts have examined patch testing results before and after initiation of dupilumab to further investigate. Puza and Atwater25 established that patients are able to mount a positive patch test reaction while on dupilumab. Moreover, a retrospective review by Raffi et al26 found that out of 125 before therapy/on therapy patch test pairs, only 13 were lost after administration of dupilumab. Although this would suggest that dupilumab has little impact on patch testing, Jo et al27 found in a systematic review that patch test reactions may remain positive, change to negative, or become newly positive after dupilumab initiation.
This inconsistency in results may relate to the allergen-specific pathogenesis of ACD—one allergen may have a different response to the mechanism of dupilumab than another.28,29 More recently, de Wijs et al30 reported a series of 20 patients in whom more than two-thirds of prior positive patch test reactions were lost after retesting on dupilumab; there were no clear trends according to the immune polarity of the allergens. This finding suggests that patient-specific factors also should be considered, as this too could have an impact on the reliability of patch test findings after starting dupilumab.29
Final Interpretation
Given its overall excellent safety profile, dupilumab may be a feasible off-label option for patients with ACD that does not respond to allergen avoidance or for those who experience adverse effects from traditional therapies; however, it remains difficult to obtain through insurance because it is not yet FDA approved for ACD. Likewise, its impact on the accuracy of patch testing is not yet well defined. Further investigations are needed to elucidate the pathophysiology of ACD and to guide further use of dupilumab in its treatment.
- Harb H, Chatila TA. Mechanisms of dupilumab. Clin Exp Allergy. 2020;50:5-14. doi:10.1111/cea.13491
- Gooderham MJ, Hong HC, Eshtiaghi P, et al. Dupilumab: a review of its use in the treatment of atopic dermatitis. J Am Acad Dermatol. 2018;78(3 suppl 1):S28-S36. doi:10.1016/j.jaad.2017.12.022
- Murphy PB, Atwater AR, Mueller M. Allergic Contact Dermatitis. StatPearls Publishing; 2022. https://www.ncbi.nlm.nih.gov/books/NBK532866/
- Dhingra N, Shemer A, Correa da Rosa J, et al. Molecular profiling of contact dermatitis skin identifies allergen-dependent differences in immune response. J Allergy Clin Immunol. 2014;134:362-372. doi:10.1016/j.jaci.2014.03.009
- Owen JL, Vakharia PP, Silverberg JI. The role and diagnosis of allergic contact dermatitis in patients with atopic dermatitis. Am J Clin Dermatol. 2018;19:293-302. doi:10.1007/s40257-017-0340-7
- Sung CT, McGowan MA, Machler BC, et al. Systemic treatments for allergic contact dermatitis. Dermatitis. 2019;30:46-53. doi:10.1097/DER.0000000000000435
- Chan CX, Zug KA. Diagnosis and management of dermatitis, including atopic, contact, and hand eczemas. Med Clin North Am. 2021;105:611-626. doi:10.1016/j.mcna.2021.04.003
- Simons JR, Bohnen IJ, van der Valk PG. A left-right comparison of UVB phototherapy and topical photochemotherapy in bilateral chronic hand dermatitis after 6 weeks’ treatment. Clin Exp Dermatol. 1997;22:7-10. doi:10.1046/j.1365-2230.1997.1640585.x
- Bhatia J, Sarin A, Wollina U, et al. Review of biologics in allergic contact dermatitis. Contact Dermatitis. 2020;83:179-181. doi:10.1111/cod.13584
- Todberg T, Zachariae C, Krustrup D, et al. The effect of anti-IL-17 treatment on the reaction to a nickel patch test in patients with allergic contact dermatitis. Int J Dermatol. 2019;58:E58-E61. doi:10.1111/ijd.14347
- Todberg T, Zachariae C, Krustrup D, et al. The effect of treatment with anti-interleukin-17 in patients with allergic contact dermatitis. Contact Dermatitis. 2018;78:431-432. doi:10.1111/cod.12988
- Joshi SR, Khan DA. Effective use of dupilumab in managing systemic allergic contact dermatitis. Dermatitis. 2018;29:282-284. doi:10.1097/DER.0000000000000409
- Goldminz AM, Scheinman PL. A case series of dupilumab-treated allergic contact dermatitis patients. Dermatol Ther. 2018;31:E12701. doi:10.1111/dth.12701
- Chipalkatti N, Lee N, Zancanaro P, et al. Dupilumab as a treatment for allergic contact dermatitis. Dermatitis. 2018;29:347-348. doi:10.1097/DER.0000000000000414
- Zhu GA, Chen JK, Chiou A, et al. Repeat patch testing in a patient with allergic contact dermatitis improved on dupilumab. JAAD Case Rep. 2019;5:336-338. doi:10.1016/j.jdcr.2019.01.023
- Machler BC, Sung CT, Darwin E, et al. Dupilumab use in allergic contact dermatitis. J Am Acad Dermatol. 2019;80:280-281.e1. doi:10.1016/j.jaad.2018.07.043
- Chipalkatti N, Lee N, Zancanaro P, et al. A retrospective review of dupilumab for atopic dermatitis patients with allergic contact dermatitis. J Am Acad Dermatol. 2019;80:1166-1167. doi:10.1016/j.jaad.2018.12.048
- Jacob SE, Sung CT, Machler BC. Dupilumab for systemic allergy syndrome with dermatitis. Dermatitis. 2019;30:164-167. doi:10.1097/DER.0000000000000446
- Ruge IF, Skov L, Zachariae C, et al. Dupilumab treatment in two patients with severe allergic contact dermatitis caused by sesquiterpene lactones. Contact Dermatitis. 2020;83:137-139. doi:10.1111/cod.13545
- Wilson B, Balogh E, Rayhan D, et al. Chromate-induced allergic contact dermatitis treated with dupilumab. J Drugs Dermatol. 2021;20:1340-1342. doi:10.36849/jdd.6246
- Jo CE, Finstad A, Georgakopoulos JR, et al. Facial and neck erythema associated with dupilumab treatment: a systematic review. J Am Acad Dermatol. 2021;84:1339-1347. doi:10.1016/j.jaad.2021.01.012
- Koblinski JE, Hamann D. Mixed occupational and iatrogenic allergic contact dermatitis in a hairdresser. Occup Med (Lond). 2020;70:523-526. doi:10.1093/occmed/kqaa152
- Levian B, Chan J, DeLeo VA, et al. Patch testing and immunosuppression: a comprehensive review. Curr Derm Rep. 2021;10:128-139.
- Shah P, Milam EC, Lo Sicco KI, et al. Dupilumab for allergic contact dermatitis and implications for patch testing: irreconcilable differences. J Am Acad Dermatol. 2020;83:E215-E216. doi:10.1016/j.jaad.2020.05.036
- Puza CJ, Atwater AR. Positive patch test reaction in a patient taking dupilumab. Dermatitis. 2018;29:89. doi:10.1097/DER.0000000000000346
- Raffi J, Suresh R, Botto N, et al. The impact of dupilumab on patch testing and the prevalence of comorbid allergic contact dermatitis in recalcitrant atopic dermatitis: a retrospective chart review. J Am Acad Dermatol. 2020;82:132-138. doi:10.1016/j.jaad.2019.09.028
- Jo CE, Mufti A, Sachdeva M, et al. Effect of dupilumab on allergic contact dermatitis and patch testing. J Am Acad Dermatol. 2021;84:1772-1776. doi:10.1016/j.jaad.2021.02.044
- Raffi J, Botto N. Patch testing and allergen-specific inhibition in a patient taking dupilumab. JAMA Dermatol. 2019;155:120-121. doi:10.1001/jamadermatol.2018.4098
- Ludwig CM, Krase JM, Shi VY. T helper 2 inhibitors in allergic contact dermatitis. Dermatitis. 2021;32:15-18. doi:10.1097/DER.0000000000000616
- de Wijs LEM, van der Waa JD, Nijsten T, et al. Effects of dupilumab treatment on patch test reactions: a retrospective evaluation. Clin Exp Allergy. 2021;51:959-967. doi:10.1111/cea.13892
Dupilumab is a humanized monoclonal antibody approved by the US Food and Drug Administration (FDA) for the treatment of moderate to severe atopic dermatitis. Through inhibition of the IL-4R α subunit, it prevents activation of the IL-4/IL-13 signaling cascade. This dampens the T H 2 inflammatory response, thereby improving the symptoms associated with atopic dermatitis. 1,2 Recent literature suggests that dupilumab may be useful in the treatment of other chronic dermatologic conditions, including allergic contact dermatitis (ACD) refractory to allergen avoidance and other treatments. Herein, we provide an overview of ACD, the role that dupilumab may play in its management, and its impact on patch testing results.
Pathogenesis of ACD
Allergic contact dermatitis is a cell-mediated type IV hypersensitivity reaction that develops through 2 distinct stages. In the sensitization phase, an allergen penetrates the skin and subsequently is engulfed by a cutaneous antigen-presenting cell. The allergen is then combined with a peptide to form a complex that is presented to naïve T lymphocytes in regional lymph nodes. The result is clonal expansion of a T-cell population that recognizes the allergen. In the elicitation phase, repeat exposure to the allergen leads to the recruitment of primed T cells to the skin, followed by cytokine release, inflammation, and resultant dermatitis.3
Historically, ACD was thought to be primarily driven by the TH1 inflammatory response; however, it is now known that TH2, TH9, TH17, and TH22 also may play a role in its pathogenesis.4,5 Another key finding is that the immune response in ACD appears to be at least partially allergen specific. Molecular profiling has revealed that nickel primarily induces a TH1/TH17 response, while allergens such as fragrance and rubber primarily induce a TH2 response.4
Management of ACD
Allergen avoidance is the mainstay of ACD treatment; however, in some patients, this approach does not always improve symptoms. In addition, eliminating the source of the allergen may not be possible in those with certain occupational, environmental, or medical exposures.
There are no FDA-approved treatments for ACD. When allergen avoidance alone is insufficient, first-line pharmacologic therapy typically includes topical or oral corticosteroids, the choice of which depends on the extent and severity of the dermatitis; however, a steroid-sparing agent often is preferred to avoid the unfavorable effects of long-term steroid use. Other systemic treatments for ACD include methotrexate, cyclosporine, mycophenolate mofetil, and azathioprine.6 These agents are used for severe ACD and typically are chosen as a last resort due to their immunosuppressive activity.
Phototherapy is another option, often as an adjunct to other therapies. Narrowband UVB and psoralen plus UVA have both been used. Psoralen plus UVA tends to have more side effects; therefore, narrowband UVB often is preferred.7,8
Use of Dupilumab in ACD
Biologics are unique, as they can target a single step in the immune response to improve a wide variety of symptoms. Research investigating their role as a treatment modality for ACD is still evolving alongside our increasing knowledge of its pathophysiology.9 Of note, studies examining the anti–IL-17 biologic secukinumab revealed it to be ineffective against ACD,10,11 which suggests that targeting specific immune components may not always result in improvement of ACD symptoms, likely because its pathophysiology involves several pathways.
There have been multiple reports demonstrating the effectiveness of dupilumab in the treatment of ACD (eTable).12-20 The findings from these studies show that dupilumab can improve recalcitrant dermatitis caused by a broad range of contact allergens, including nickel. This highlights its ability to improve ACD caused by allergens with a TH1 bias, despite its primarily TH2-dampening effects. Notably, several studies have reported successful use of dupilumab for systemic ACD.12,18 In addition, dupilumab may be able to improve symptoms of ACD in as little as 1 to 4 weeks. Unlike some systemic therapies for ACD, dupilumab also benefits from its lack of notable immunosuppressive effects.9 A phase 4 clinical trial at Brigham and Women’s Hospital (Boston, Massachusetts) is recruiting participants, with a primary goal of investigating dupilumab’s impact on ACD in patients who have not improved despite allergen avoidance (ClinicalTrials.gov identifier NCT03935971).
There are a few potential disadvantages to dupilumab. Because it is not yet FDA approved for the treatment of ACD, insurance companies may deny coverage, making it likely to be unaffordable for most patients. Furthermore, the side-effect profile has not been fully characterized. In addition to ocular adverse effects, a growing number of studies have reported face and neck erythema after starting dupilumab. Although the cause is unclear, one theory is that the inhibition of IL-4/IL-13 leads to TH1/TH17 polarization, thereby worsening ACD caused by allergens that activate a TH1-predominant response.21 Finally, not all cases of ACD respond to dupilumab.22
Patch Testing While on Dupilumab
Diagnosing ACD is a challenging process. An accurate history and physical examination are critical, and patch testing remains the gold standard when it comes to identifying the source of the contact allergen(s).
There is ongoing debate among contact dermatitis experts regarding the diagnostic accuracy of patch testing for those on immunomodulators or immunosuppressants, as these medications can dampen positive results and increase the risk for false-negative readings.23 Consequently, some have questioned whether patch testing on dupilumab is accurate or feasible.24 Contact dermatitis experts have examined patch testing results before and after initiation of dupilumab to further investigate. Puza and Atwater25 established that patients are able to mount a positive patch test reaction while on dupilumab. Moreover, a retrospective review by Raffi et al26 found that out of 125 before therapy/on therapy patch test pairs, only 13 were lost after administration of dupilumab. Although this would suggest that dupilumab has little impact on patch testing, Jo et al27 found in a systematic review that patch test reactions may remain positive, change to negative, or become newly positive after dupilumab initiation.
This inconsistency in results may relate to the allergen-specific pathogenesis of ACD—one allergen may have a different response to the mechanism of dupilumab than another.28,29 More recently, de Wijs et al30 reported a series of 20 patients in whom more than two-thirds of prior positive patch test reactions were lost after retesting on dupilumab; there were no clear trends according to the immune polarity of the allergens. This finding suggests that patient-specific factors also should be considered, as this too could have an impact on the reliability of patch test findings after starting dupilumab.29
Final Interpretation
Given its overall excellent safety profile, dupilumab may be a feasible off-label option for patients with ACD that does not respond to allergen avoidance or for those who experience adverse effects from traditional therapies; however, it remains difficult to obtain through insurance because it is not yet FDA approved for ACD. Likewise, its impact on the accuracy of patch testing is not yet well defined. Further investigations are needed to elucidate the pathophysiology of ACD and to guide further use of dupilumab in its treatment.
Dupilumab is a humanized monoclonal antibody approved by the US Food and Drug Administration (FDA) for the treatment of moderate to severe atopic dermatitis. Through inhibition of the IL-4R α subunit, it prevents activation of the IL-4/IL-13 signaling cascade. This dampens the T H 2 inflammatory response, thereby improving the symptoms associated with atopic dermatitis. 1,2 Recent literature suggests that dupilumab may be useful in the treatment of other chronic dermatologic conditions, including allergic contact dermatitis (ACD) refractory to allergen avoidance and other treatments. Herein, we provide an overview of ACD, the role that dupilumab may play in its management, and its impact on patch testing results.
Pathogenesis of ACD
Allergic contact dermatitis is a cell-mediated type IV hypersensitivity reaction that develops through 2 distinct stages. In the sensitization phase, an allergen penetrates the skin and subsequently is engulfed by a cutaneous antigen-presenting cell. The allergen is then combined with a peptide to form a complex that is presented to naïve T lymphocytes in regional lymph nodes. The result is clonal expansion of a T-cell population that recognizes the allergen. In the elicitation phase, repeat exposure to the allergen leads to the recruitment of primed T cells to the skin, followed by cytokine release, inflammation, and resultant dermatitis.3
Historically, ACD was thought to be primarily driven by the TH1 inflammatory response; however, it is now known that TH2, TH9, TH17, and TH22 also may play a role in its pathogenesis.4,5 Another key finding is that the immune response in ACD appears to be at least partially allergen specific. Molecular profiling has revealed that nickel primarily induces a TH1/TH17 response, while allergens such as fragrance and rubber primarily induce a TH2 response.4
Management of ACD
Allergen avoidance is the mainstay of ACD treatment; however, in some patients, this approach does not always improve symptoms. In addition, eliminating the source of the allergen may not be possible in those with certain occupational, environmental, or medical exposures.
There are no FDA-approved treatments for ACD. When allergen avoidance alone is insufficient, first-line pharmacologic therapy typically includes topical or oral corticosteroids, the choice of which depends on the extent and severity of the dermatitis; however, a steroid-sparing agent often is preferred to avoid the unfavorable effects of long-term steroid use. Other systemic treatments for ACD include methotrexate, cyclosporine, mycophenolate mofetil, and azathioprine.6 These agents are used for severe ACD and typically are chosen as a last resort due to their immunosuppressive activity.
Phototherapy is another option, often as an adjunct to other therapies. Narrowband UVB and psoralen plus UVA have both been used. Psoralen plus UVA tends to have more side effects; therefore, narrowband UVB often is preferred.7,8
Use of Dupilumab in ACD
Biologics are unique, as they can target a single step in the immune response to improve a wide variety of symptoms. Research investigating their role as a treatment modality for ACD is still evolving alongside our increasing knowledge of its pathophysiology.9 Of note, studies examining the anti–IL-17 biologic secukinumab revealed it to be ineffective against ACD,10,11 which suggests that targeting specific immune components may not always result in improvement of ACD symptoms, likely because its pathophysiology involves several pathways.
There have been multiple reports demonstrating the effectiveness of dupilumab in the treatment of ACD (eTable).12-20 The findings from these studies show that dupilumab can improve recalcitrant dermatitis caused by a broad range of contact allergens, including nickel. This highlights its ability to improve ACD caused by allergens with a TH1 bias, despite its primarily TH2-dampening effects. Notably, several studies have reported successful use of dupilumab for systemic ACD.12,18 In addition, dupilumab may be able to improve symptoms of ACD in as little as 1 to 4 weeks. Unlike some systemic therapies for ACD, dupilumab also benefits from its lack of notable immunosuppressive effects.9 A phase 4 clinical trial at Brigham and Women’s Hospital (Boston, Massachusetts) is recruiting participants, with a primary goal of investigating dupilumab’s impact on ACD in patients who have not improved despite allergen avoidance (ClinicalTrials.gov identifier NCT03935971).
There are a few potential disadvantages to dupilumab. Because it is not yet FDA approved for the treatment of ACD, insurance companies may deny coverage, making it likely to be unaffordable for most patients. Furthermore, the side-effect profile has not been fully characterized. In addition to ocular adverse effects, a growing number of studies have reported face and neck erythema after starting dupilumab. Although the cause is unclear, one theory is that the inhibition of IL-4/IL-13 leads to TH1/TH17 polarization, thereby worsening ACD caused by allergens that activate a TH1-predominant response.21 Finally, not all cases of ACD respond to dupilumab.22
Patch Testing While on Dupilumab
Diagnosing ACD is a challenging process. An accurate history and physical examination are critical, and patch testing remains the gold standard when it comes to identifying the source of the contact allergen(s).
There is ongoing debate among contact dermatitis experts regarding the diagnostic accuracy of patch testing for those on immunomodulators or immunosuppressants, as these medications can dampen positive results and increase the risk for false-negative readings.23 Consequently, some have questioned whether patch testing on dupilumab is accurate or feasible.24 Contact dermatitis experts have examined patch testing results before and after initiation of dupilumab to further investigate. Puza and Atwater25 established that patients are able to mount a positive patch test reaction while on dupilumab. Moreover, a retrospective review by Raffi et al26 found that out of 125 before therapy/on therapy patch test pairs, only 13 were lost after administration of dupilumab. Although this would suggest that dupilumab has little impact on patch testing, Jo et al27 found in a systematic review that patch test reactions may remain positive, change to negative, or become newly positive after dupilumab initiation.
This inconsistency in results may relate to the allergen-specific pathogenesis of ACD—one allergen may have a different response to the mechanism of dupilumab than another.28,29 More recently, de Wijs et al30 reported a series of 20 patients in whom more than two-thirds of prior positive patch test reactions were lost after retesting on dupilumab; there were no clear trends according to the immune polarity of the allergens. This finding suggests that patient-specific factors also should be considered, as this too could have an impact on the reliability of patch test findings after starting dupilumab.29
Final Interpretation
Given its overall excellent safety profile, dupilumab may be a feasible off-label option for patients with ACD that does not respond to allergen avoidance or for those who experience adverse effects from traditional therapies; however, it remains difficult to obtain through insurance because it is not yet FDA approved for ACD. Likewise, its impact on the accuracy of patch testing is not yet well defined. Further investigations are needed to elucidate the pathophysiology of ACD and to guide further use of dupilumab in its treatment.
- Harb H, Chatila TA. Mechanisms of dupilumab. Clin Exp Allergy. 2020;50:5-14. doi:10.1111/cea.13491
- Gooderham MJ, Hong HC, Eshtiaghi P, et al. Dupilumab: a review of its use in the treatment of atopic dermatitis. J Am Acad Dermatol. 2018;78(3 suppl 1):S28-S36. doi:10.1016/j.jaad.2017.12.022
- Murphy PB, Atwater AR, Mueller M. Allergic Contact Dermatitis. StatPearls Publishing; 2022. https://www.ncbi.nlm.nih.gov/books/NBK532866/
- Dhingra N, Shemer A, Correa da Rosa J, et al. Molecular profiling of contact dermatitis skin identifies allergen-dependent differences in immune response. J Allergy Clin Immunol. 2014;134:362-372. doi:10.1016/j.jaci.2014.03.009
- Owen JL, Vakharia PP, Silverberg JI. The role and diagnosis of allergic contact dermatitis in patients with atopic dermatitis. Am J Clin Dermatol. 2018;19:293-302. doi:10.1007/s40257-017-0340-7
- Sung CT, McGowan MA, Machler BC, et al. Systemic treatments for allergic contact dermatitis. Dermatitis. 2019;30:46-53. doi:10.1097/DER.0000000000000435
- Chan CX, Zug KA. Diagnosis and management of dermatitis, including atopic, contact, and hand eczemas. Med Clin North Am. 2021;105:611-626. doi:10.1016/j.mcna.2021.04.003
- Simons JR, Bohnen IJ, van der Valk PG. A left-right comparison of UVB phototherapy and topical photochemotherapy in bilateral chronic hand dermatitis after 6 weeks’ treatment. Clin Exp Dermatol. 1997;22:7-10. doi:10.1046/j.1365-2230.1997.1640585.x
- Bhatia J, Sarin A, Wollina U, et al. Review of biologics in allergic contact dermatitis. Contact Dermatitis. 2020;83:179-181. doi:10.1111/cod.13584
- Todberg T, Zachariae C, Krustrup D, et al. The effect of anti-IL-17 treatment on the reaction to a nickel patch test in patients with allergic contact dermatitis. Int J Dermatol. 2019;58:E58-E61. doi:10.1111/ijd.14347
- Todberg T, Zachariae C, Krustrup D, et al. The effect of treatment with anti-interleukin-17 in patients with allergic contact dermatitis. Contact Dermatitis. 2018;78:431-432. doi:10.1111/cod.12988
- Joshi SR, Khan DA. Effective use of dupilumab in managing systemic allergic contact dermatitis. Dermatitis. 2018;29:282-284. doi:10.1097/DER.0000000000000409
- Goldminz AM, Scheinman PL. A case series of dupilumab-treated allergic contact dermatitis patients. Dermatol Ther. 2018;31:E12701. doi:10.1111/dth.12701
- Chipalkatti N, Lee N, Zancanaro P, et al. Dupilumab as a treatment for allergic contact dermatitis. Dermatitis. 2018;29:347-348. doi:10.1097/DER.0000000000000414
- Zhu GA, Chen JK, Chiou A, et al. Repeat patch testing in a patient with allergic contact dermatitis improved on dupilumab. JAAD Case Rep. 2019;5:336-338. doi:10.1016/j.jdcr.2019.01.023
- Machler BC, Sung CT, Darwin E, et al. Dupilumab use in allergic contact dermatitis. J Am Acad Dermatol. 2019;80:280-281.e1. doi:10.1016/j.jaad.2018.07.043
- Chipalkatti N, Lee N, Zancanaro P, et al. A retrospective review of dupilumab for atopic dermatitis patients with allergic contact dermatitis. J Am Acad Dermatol. 2019;80:1166-1167. doi:10.1016/j.jaad.2018.12.048
- Jacob SE, Sung CT, Machler BC. Dupilumab for systemic allergy syndrome with dermatitis. Dermatitis. 2019;30:164-167. doi:10.1097/DER.0000000000000446
- Ruge IF, Skov L, Zachariae C, et al. Dupilumab treatment in two patients with severe allergic contact dermatitis caused by sesquiterpene lactones. Contact Dermatitis. 2020;83:137-139. doi:10.1111/cod.13545
- Wilson B, Balogh E, Rayhan D, et al. Chromate-induced allergic contact dermatitis treated with dupilumab. J Drugs Dermatol. 2021;20:1340-1342. doi:10.36849/jdd.6246
- Jo CE, Finstad A, Georgakopoulos JR, et al. Facial and neck erythema associated with dupilumab treatment: a systematic review. J Am Acad Dermatol. 2021;84:1339-1347. doi:10.1016/j.jaad.2021.01.012
- Koblinski JE, Hamann D. Mixed occupational and iatrogenic allergic contact dermatitis in a hairdresser. Occup Med (Lond). 2020;70:523-526. doi:10.1093/occmed/kqaa152
- Levian B, Chan J, DeLeo VA, et al. Patch testing and immunosuppression: a comprehensive review. Curr Derm Rep. 2021;10:128-139.
- Shah P, Milam EC, Lo Sicco KI, et al. Dupilumab for allergic contact dermatitis and implications for patch testing: irreconcilable differences. J Am Acad Dermatol. 2020;83:E215-E216. doi:10.1016/j.jaad.2020.05.036
- Puza CJ, Atwater AR. Positive patch test reaction in a patient taking dupilumab. Dermatitis. 2018;29:89. doi:10.1097/DER.0000000000000346
- Raffi J, Suresh R, Botto N, et al. The impact of dupilumab on patch testing and the prevalence of comorbid allergic contact dermatitis in recalcitrant atopic dermatitis: a retrospective chart review. J Am Acad Dermatol. 2020;82:132-138. doi:10.1016/j.jaad.2019.09.028
- Jo CE, Mufti A, Sachdeva M, et al. Effect of dupilumab on allergic contact dermatitis and patch testing. J Am Acad Dermatol. 2021;84:1772-1776. doi:10.1016/j.jaad.2021.02.044
- Raffi J, Botto N. Patch testing and allergen-specific inhibition in a patient taking dupilumab. JAMA Dermatol. 2019;155:120-121. doi:10.1001/jamadermatol.2018.4098
- Ludwig CM, Krase JM, Shi VY. T helper 2 inhibitors in allergic contact dermatitis. Dermatitis. 2021;32:15-18. doi:10.1097/DER.0000000000000616
- de Wijs LEM, van der Waa JD, Nijsten T, et al. Effects of dupilumab treatment on patch test reactions: a retrospective evaluation. Clin Exp Allergy. 2021;51:959-967. doi:10.1111/cea.13892
- Harb H, Chatila TA. Mechanisms of dupilumab. Clin Exp Allergy. 2020;50:5-14. doi:10.1111/cea.13491
- Gooderham MJ, Hong HC, Eshtiaghi P, et al. Dupilumab: a review of its use in the treatment of atopic dermatitis. J Am Acad Dermatol. 2018;78(3 suppl 1):S28-S36. doi:10.1016/j.jaad.2017.12.022
- Murphy PB, Atwater AR, Mueller M. Allergic Contact Dermatitis. StatPearls Publishing; 2022. https://www.ncbi.nlm.nih.gov/books/NBK532866/
- Dhingra N, Shemer A, Correa da Rosa J, et al. Molecular profiling of contact dermatitis skin identifies allergen-dependent differences in immune response. J Allergy Clin Immunol. 2014;134:362-372. doi:10.1016/j.jaci.2014.03.009
- Owen JL, Vakharia PP, Silverberg JI. The role and diagnosis of allergic contact dermatitis in patients with atopic dermatitis. Am J Clin Dermatol. 2018;19:293-302. doi:10.1007/s40257-017-0340-7
- Sung CT, McGowan MA, Machler BC, et al. Systemic treatments for allergic contact dermatitis. Dermatitis. 2019;30:46-53. doi:10.1097/DER.0000000000000435
- Chan CX, Zug KA. Diagnosis and management of dermatitis, including atopic, contact, and hand eczemas. Med Clin North Am. 2021;105:611-626. doi:10.1016/j.mcna.2021.04.003
- Simons JR, Bohnen IJ, van der Valk PG. A left-right comparison of UVB phototherapy and topical photochemotherapy in bilateral chronic hand dermatitis after 6 weeks’ treatment. Clin Exp Dermatol. 1997;22:7-10. doi:10.1046/j.1365-2230.1997.1640585.x
- Bhatia J, Sarin A, Wollina U, et al. Review of biologics in allergic contact dermatitis. Contact Dermatitis. 2020;83:179-181. doi:10.1111/cod.13584
Practice Points
- Dupilumab is approved by the US Food and Drug Administration for the treatment of moderate to severe atopic dermatitis.
- Multiple reports have suggested that dupilumab may be effective in the treatment of allergic contact dermatitis, and a phase 4 clinical trial is ongoing.
- The accuracy of patch testing after dupilumab initiation is unclear, as reactions may remain positive, change to negative, or become newly positive after its administration.
Ondansetron use accelerates for acute gastroenteritis in children
Use of oral ondansetron for acute gastroenteritis in children in an emergency setting increased significantly between 2006 and 2018, but use of intravenous fluids remained consistent, based on data from a cross-sectional analysis.
Recommendations for managing acute gastroenteritis in children include oral rehydration therapy for mild to moderate cases and intravenous rehydration for severe cases, Brett Burstein, MDCM, of McGill University, Montreal, and colleagues wrote.
Oral ondansetron has been shown to reduce vomiting and the need for intravenous rehydration, as well as reduce the need for hospitalization in children with evidence of dehydration, but has no significant benefits for children who are not dehydrated, the researchers noted.
“Given the high prevalence and costs associated with acute gastroenteritis treatment for children, understanding national trends in management in a broad, generalizable sample is important,” they wrote.
In a study published in JAMA Network Open, the researchers identified data from the National Hospital Ambulatory Medical Care Survey from Jan. 1, 2006, to Dec. 31, 2018. They analyzed emergency department (ED) visits by individuals younger than 18 years with either a primary discharge diagnosis of acute gastroenteritis or a primary diagnosis of nausea, vomiting, diarrhea, or dehydration with a secondary diagnosis of acute gastroenteritis. The study population included 4,122 patients with a mean age of 4.8 years. Approximately 85% of the visits were to nonacademic EDs, and 80% were to nonpediatric EDs.
Overall, ED visits for acute gastroenteritis increased over time, from 1.23 million in 2006 to 1.87 million in 2018 (P = .03 for trend). ED visits for acute gastroenteritis also increased significantly as a proportion of all ED pediatric visits, from 4.7% in 2006 to 5.6% in 2018 (P = .02 for trend).
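The "P for trend" values reported above come from testing for a linear trend in proportions across survey years. The published analysis likely used survey-weighted methods, but the idea can be illustrated with a minimal unweighted sketch of the classic Cochran-Armitage trend test (all counts below are invented for illustration, not the study's data):

```python
import math

def cochran_armitage_trend(successes, totals, scores=None):
    """Two-sided Cochran-Armitage test for linear trend in proportions.

    successes[i]: number of events in ordered group i (e.g., a survey year)
    totals[i]:    size of group i
    scores[i]:    ordinal scores for the groups (default 0, 1, 2, ...)
    Returns (z, p_value), where z is the standard-normal test statistic.
    """
    k = len(successes)
    if scores is None:
        scores = list(range(k))
    N = sum(totals)
    p_bar = sum(successes) / N  # pooled proportion under the null
    # Score-weighted deviation of observed events from expectation
    num = sum(t * (r - n * p_bar)
              for t, r, n in zip(scores, successes, totals))
    s1 = sum(n * t * t for t, n in zip(scores, totals))
    s2 = sum(n * t for t, n in zip(scores, totals))
    var = p_bar * (1 - p_bar) * (s1 - s2 * s2 / N)
    z = num / math.sqrt(var)
    # Two-sided p-value from the normal distribution
    p = math.erfc(abs(z) / math.sqrt(2))
    return z, p

# Rising proportions (10% -> 20% -> 30%) yield a small trend p-value
z, p = cochran_armitage_trend([10, 20, 30], [100, 100, 100])
```

With flat proportions the statistic is essentially zero and the p-value near 1, matching the intuition that "P for trend" measures monotone change over time.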
Notably, the use of ondansetron increased from 10.6% in 2006 to 59.2% in 2018; however, intravenous rehydration and hospitalizations remained consistent over the study period, the researchers wrote. Approximately half of children who received intravenous fluids (53.9%) and those hospitalized (49.1%) also received ondansetron.
“Approximately half of children administered intravenous fluids or hospitalized did not receive ondansetron, suggesting that many children without dehydration receive ondansetron with limited benefit, whereas those most likely to benefit receive intravenous fluids without an adequate trial of ondansetron and oral rehydration therapy,” the researchers wrote in their discussion of the findings.
The study findings were limited by several factors, including the lack of detailed patient-level data such as severity of dehydration, the researchers noted. Other limitations were the absence of data on return visits and on the route of medication administration, which means the perceived lack of benefit from ondansetron may reflect children who were treated with both intravenous ondansetron and intravenous fluids, they said.
“Ondansetron-supported oral rehydration therapy for appropriately selected children can achieve intravenous rehydration rates of 9%, more than threefold lower than 2018 national estimates,” and more initiatives are needed to optimize ondansetron and reduce the excessive use of intravenous fluids, the researchers concluded.
Emergency care setting may promote IV fluid use
“Acute gastroenteritis has remained a major cause of pediatric morbidity and mortality worldwide with significant costs for the health care system,” Tim Joos, MD, a Seattle-based clinician with a combination internal medicine/pediatrics practice who was not involved in the current study, said in an interview. “The authors highlight that although ondansetron use for acute gastroenteritis in the ED has increased substantially, there are still a number of children who receive intravenous fluids in the ED without a trial of ondansetron and [oral rehydration therapy] first. For the individual patient, it is not surprising that the fast-paced culture of the ED doesn’t cater to a watchful waiting approach. This highlights the need for a more protocol-based algorithm for care of these patients upon check-in.
“Often the practice in the ED is a single dose of ondansetron, followed by attempts at oral rehydration 30 minutes later,” said Dr. Joos. “It would be interesting to know the extent that outpatient clinics are practicing this model prior to sending the patient on to the ED. Despite it becoming a common practice, there is still ongoing research into the efficacy and safety of multidose oral ondansetron at home in reducing ED visits/hospitalizations.”
The study received no outside funding. The researchers had no financial conflicts to disclose. Dr. Joos had no financial conflicts to disclose. Lead author Dr. Burstein received a career award from the Quebec Health Research Fund.
FROM JAMA NETWORK OPEN
Skin Cancer Education in the Medical School Curriculum
To the Editor:
Skin cancer represents a notable health care burden of rising incidence.1-3 Nondermatologist health care providers play a key role in skin cancer screening through the use of skin cancer examination (SCE)1,4; however, several factors including poor diagnostic accuracy, low confidence, and lack of training have contributed to limited use of the SCE by these providers.4,5 Therefore, it is important to identify and implement changes in the medical school curriculum that can facilitate improved use of SCE in clinical practice. We sought to examine factors in the medical school curriculum that influence skin cancer education.
A voluntary electronic survey was distributed through class email and social media to all medical student classes at 4 medical schools (Figure). Responses were collected between March 2 and April 20, 2020. Survey items assessed demographics and curricular factors that influence skin cancer education.
Knowledge of the clinical features of melanoma was assessed by asking participants to correctly identify at least 5 of 6 pigmented lesions as concerning or not concerning for melanoma. Confidence in performing the SCE—the primary outcome—was measured by dichotomizing a 4-point Likert-type scale (“very confident” and “moderately confident” against “slightly confident” and “not at all confident”).
Logistic regression was used to examine curricular factors associated with confidence; descriptive statistics were used for remaining analyses. Analyses were performed using SAS 9.4 statistical software. Prior to analysis, responses from the University of South Carolina School of Medicine Greenville were excluded because the response rate was less than 20%.
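As a rough illustration of the analysis described above: confidence is dichotomized from the 4-point scale, and for a single binary curricular factor the unadjusted logistic regression odds ratio reduces to the cross-product ratio of the resulting 2×2 table. The study used SAS and may have adjusted for covariates; this is only a conceptual sketch, and every count below is hypothetical:

```python
import math

def dichotomize(likert_responses):
    """Collapse a 4-point scale (1 = very confident ... 4 = not at all
    confident) into confident (1-2) vs. not confident (3-4)."""
    return [1 if r <= 2 else 0 for r in likert_responses]

def unadjusted_or(a, b, c, d):
    """Odds ratio for a 2x2 table:
        exposed group:   a confident, b not confident
        unexposed group: c confident, d not confident
    For one binary predictor, exp(beta) from an unadjusted logistic
    regression equals this cross-product ratio."""
    return (a * d) / (b * c)

def logit_beta(a, b, c, d):
    """Log-odds difference between groups; exp() recovers the OR."""
    return math.log(a / b) - math.log(c / d)

# Hypothetical: 30/100 clinically trained students confident
# vs. 10/100 lecture-only students confident
or_estimate = unadjusted_or(30, 70, 10, 90)
```

In other words, an odds ratio like the 4.14 reported above says the odds of confidence in one group are about four times the odds in the reference group, not that the probability itself is four times higher.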
The survey was distributed to 1524 students; 619 (40.6%) answered at least 1 question, with a variable response rate to each item (eTable 1). Most respondents were female (351 [56.7%]); 438 (70.8%) were White.
Most respondents said that they received 3 hours or less of general skin cancer education (74.9%) or SCE-specific education (93.0%) by the end of their fourth year of medical training. Lecture was the most common method of instruction. Education was provided most often by dermatologists (48.6%), followed by general practice physicians (21.2%). Many fourth-year respondents (26.9%) reported that they had never observed an SCE; even more (47.6%) had never performed one. Almost half of second- and third-year students (43.2% and 44.8%, respectively) considered themselves knowledgeable about the clinical features of melanoma, but only 31.9% of fourth-year students considered themselves knowledgeable.
Only 24.1% of fourth-year students reported confidence in performing the SCE (eTable 1). Students who received most of their instruction through real clinical encounters were 4.14 times more likely to be confident performing the SCE than students taught primarily by lecture. Students who had performed 1 to 3 SCEs or 4 or more SCEs were 3.02 and 32.25 times more likely, respectively, to be confident than students who had never performed one (eTable 2).
Consistent with a recent study,6 our results reflect the discrepancy between the burden of skin cancer and the extent of skin cancer education. This is especially demonstrated by our cohort’s low confidence in performing the SCE, a metric associated with both intention to perform and actual performance of the SCE in practice.4,5 We also observed a downward trend in knowledge among students who were about to enter residency, potentially indicating the need for longitudinal training.
Given curricular time constraints, it is essential that medical schools implement changes in learning that will have the greatest impact. Although our results strongly support the efficacy of hands-on clinical training, exposure to dermatology in the second half of medical school training is limited nationwide.6 Concentrated efforts to increase clinical exposure might help prepare future physicians in all specialties to combat the burden of this disease.
Limitations of our study include the potential for selection and recall biases. Although our survey spanned multiple institutions in different regions of the United States, results might not be universally representative.
Acknowledgments—We thank Dirk Elston, MD, and Amy Wahlquist, MS (both from Charleston, South Carolina), who helped facilitate the survey on which our research is based. We also acknowledge the assistance of Philip Carmon, MD (Columbia, South Carolina); Julie Flugel (Columbia, South Carolina); Algimantas Simpson, MD (Columbia, South Carolina); Nathan Jasperse, MD (Irvine, California); Jeremy Teruel, MD (Charleston, South Carolina); Alan Snyder, MD, MSCR (Charleston, South Carolina); John Bosland (Charleston, South Carolina); and Daniel Spangler (Greenville, South Carolina).
- Guy GP Jr, Machlin SR, Ekwueme DU, et al. Prevalence and costs of skin cancer treatment in the U.S., 2002-2006 and 2007-2011. Am J Prev Med. 2015;48:183-187. doi:10.1016/j.amepre.2014.08.036
- Paulson KG, Gupta D, Kim TS, et al. Age-specific incidence of melanoma in the United States. JAMA Dermatol. 2020;156:57-64. doi:10.1001/jamadermatol.2019.3353
- Lim HW, Collins SAB, Resneck JS Jr, et al. Contribution of health care factors to the burden of skin disease in the United States. J Am Acad Dermatol. 2017;76:1151-1160.e21. doi:10.1016/j.jaad.2017.03.006
- Garg A, Wang J, Reddy SB, et al; Integrated Skin Exam Consortium. Curricular factors associated with medical students’ practice of the skin cancer examination: an educational enhancement initiative by the Integrated Skin Exam Consortium. JAMA Dermatol. 2014;150:850-855. doi:10.1001/jamadermatol.2013.8723
- Oliveria SA, Heneghan MK, Cushman LF, et al. Skin cancer screening by dermatologists, family practitioners, and internists: barriers and facilitating factors. Arch Dermatol. 2011;147:39-44. doi:10.1001/archdermatol.2010.414
- Cahn BA, Harper HE, Halverstam CP, et al. Current status of dermatologic education in US medical schools. JAMA Dermatol. 2020;156:468-470. doi:10.1001/jamadermatol.2020.0006
To the Editor:
Skin cancer represents a notable health care burden of rising incidence.1-3 Nondermatologist health care providers play a key role in skin cancer screening through use of the skin cancer examination (SCE)1,4; however, several factors, including poor diagnostic accuracy, low confidence, and lack of training, have contributed to limited use of the SCE by these providers.4,5 Therefore, it is important to identify and implement changes in the medical school curriculum that can facilitate improved use of the SCE in clinical practice. We sought to examine factors in the medical school curriculum that influence skin cancer education.
A voluntary electronic survey was distributed through class email and social media to all medical student classes at 4 medical schools (Figure). Responses were collected between March 2 and April 20, 2020. Survey items assessed demographics and curricular factors that influence skin cancer education.
Knowledge of the clinical features of melanoma was assessed by asking participants to correctly identify at least 5 of 6 pigmented lesions as concerning or not concerning for melanoma. Confidence in performing the SCE—the primary outcome—was measured by dichotomizing a 4-point Likert-type scale (“very confident” and “moderately confident” against “slightly confident” and “not at all confident”).
Logistic regression was used to examine curricular factors associated with confidence; descriptive statistics were used for remaining analyses. Analyses were performed using SAS 9.4 statistical software. Prior to analysis, responses from the University of South Carolina School of Medicine Greenville were excluded because the response rate was less than 20%.
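The dichotomization of the Likert-type confidence item and the odds-based comparisons reported below can be sketched in a few lines. This is an illustrative sketch only, not the authors' analysis (the study used logistic regression in SAS 9.4), and all counts here are fabricated for demonstration:

```python
# Illustrative sketch of the analysis approach described above.
# The study itself used logistic regression in SAS 9.4; the counts
# below are fabricated purely to show the mechanics.

def dichotomize(likert_response: int) -> bool:
    """Collapse a 4-point Likert item: 'very' (4) or 'moderately' (3)
    confident counts as confident; 'slightly' (2) or 'not at all' (1)
    counts as not confident."""
    return likert_response >= 3

# Hypothetical 2x2 table: rows = exposure (ever performed an SCE vs.
# never), columns = outcome (confident vs. not confident).
a, b = 30, 20   # performed SCE: confident / not confident
c, d = 10, 40   # never performed: confident / not confident

# Unadjusted odds ratio = (a/b) / (c/d) = (a*d) / (b*c)
odds_ratio = (a / b) / (c / d)
print(round(odds_ratio, 2))  # prints 6.0
```

A multivariable logistic regression, as used in the study, generalizes this by exponentiating each fitted coefficient to obtain an odds ratio adjusted for the other curricular factors.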
The survey was distributed to 1524 students; 619 (40.6%) answered at least 1 question, with a variable response rate to each item (eTable 1). Most respondents were female (351 [56.7%]); 438 (70.8%) were White.
Most respondents said that they received 3 hours or less of general skin cancer (74.9%) or SCE-specific (93.0%) education by the end of their fourth year of medical training. Lecture was the most common method of instruction. Education was provided most often by dermatologists (48.6%), followed by general practice physicians (21.2%). More than one-quarter (26.9%) of fourth-year respondents reported that they had never observed an SCE; even more (47.6%) had never performed one. Almost half of second- and third-year students (43.2% and 44.8%, respectively) considered themselves knowledgeable about the clinical features of melanoma, but only 31.9% of fourth-year students considered themselves knowledgeable.
Only 24.1% of fourth-year students reported confidence performing SCE (eTable 1). Students who received most of their instruction through real clinical encounters were 4.14 times more likely to be confident performing SCE than students who had been given lecture-based learning. Students who performed 1 to 3 SCE or 4 or more SCE were 3.02 and 32.25 times, respectively, more likely to be confident than students who had never performed SCE (eTable 2).
Consistent with a recent study,6 our results reflect the discrepancy between the burden of skin cancer and the education devoted to it. This discrepancy is especially demonstrated by our cohort’s low confidence in performing the SCE, a metric associated with both the intention to perform and the actual performance of the SCE in practice.4,5 We also observed a downward trend in knowledge among students about to enter residency, potentially indicating the need for longitudinal training.
Given curricular time constraints, it is essential that medical schools implement changes in learning that will have the greatest impact. Although our results strongly support the efficacy of hands-on clinical training, exposure to dermatology in the second half of medical school training is limited nationwide.6 Concentrated efforts to increase clinical exposure might help prepare future physicians in all specialties to combat the burden of this disease.
Limitations of our study include the potential for selection and recall biases. Although our survey spanned multiple institutions in different regions of the United States, results might not be universally representative.
Acknowledgments—We thank Dirk Elston, MD, and Amy Wahlquist, MS (both from Charleston, South Carolina), who helped facilitate the survey on which our research is based. We also acknowledge the assistance of Philip Carmon, MD (Columbia, South Carolina); Julie Flugel (Columbia, South Carolina); Algimantas Simpson, MD (Columbia, South Carolina); Nathan Jasperse, MD (Irvine, California); Jeremy Teruel, MD (Charleston, South Carolina); Alan Snyder, MD, MSCR (Charleston, South Carolina); John Bosland (Charleston, South Carolina); and Daniel Spangler (Greenville, South Carolina).
- Guy GP Jr, Machlin SR, Ekwueme DU, et al. Prevalence and costs of skin cancer treatment in the U.S., 2002–2006 and 2007-2011. Am J Prev Med. 2015;48:183-187. doi:10.1016/j.amepre.2014.08.036
- Paulson KG, Gupta D, Kim TS, et al. Age-specific incidence of melanoma in the United States. JAMA Dermatol. 2020;156:57-64. doi:10.1001/jamadermatol.2019.3353
- Lim HW, Collins SAB, Resneck JS Jr, et al. Contribution of health care factors to the burden of skin disease in the United States. J Am Acad Dermatol. 2017;76:1151-1160.e21. doi:10.1016/j.jaad.2017.03.006
- Garg A, Wang J, Reddy SB, et al; Integrated Skin Exam Consortium. Curricular factors associated with medical students’ practice of the skin cancer examination: an educational enhancement initiative by the Integrated Skin Exam Consortium. JAMA Dermatol. 2014;150:850-855. doi:10.1001/jamadermatol.2013.8723
- Oliveria SA, Heneghan MK, Cushman LF, et al. Skin cancer screening by dermatologists, family practitioners, and internists: barriers and facilitating factors. Arch Dermatol. 2011;147:39-44. doi:10.1001/archdermatol.2010.414
- Cahn BA, Harper HE, Halverstam CP, et al. Current status of dermatologic education in US medical schools. JAMA Dermatol. 2020;156:468-470. doi:10.1001/jamadermatol.2020.0006
Practice Points
- Nondermatologist practitioners play a notable role in mitigating the health care burden of skin cancer by screening with the skin cancer examination.
- Exposure to the skin cancer examination should occur during medical school prior to graduates’ entering diverse specialties.
- Most medical students received relatively few hours of skin cancer education, and many never performed or even observed a skin cancer examination prior to graduating medical school.
- Increasing hands-on training and clinical exposure during medical school is imperative to adequately prepare future physicians.
Aquatic Antagonists: Marine Rashes (Seabather’s Eruption and Diver’s Dermatitis)
Background and Clinical Presentation
Seabather’s Eruption—Seabather’s eruption is a type I and IV hypersensitivity reaction caused by nematocysts of larval-stage thimble jellyfish (Linuche unguiculata), sea anemones (eg, Edwardsiella lineata), and larval cnidarians.1 Linuche unguiculata commonly is found along the southeast coast of the United States and in the Caribbean, the Gulf of Mexico, and coastal Florida; less commonly, it has been reported along the coasts of Brazil and Papua New Guinea. Edwardsiella lineata more commonly is seen along the East Coast of the United States.2 Seabather’s eruption presents as numerous scattered, pruritic, red macules and papules (measuring 1 mm to 1.5 cm) distributed in areas covered by skin folds, wet clothing, or hair following exposure to marine water (Figure 1). This maculopapular rash generally appears shortly after exiting the water and can last up to several weeks in some cases.3 The cause of this delayed presentation is that the marine organisms become entrapped between the swimmer’s skin and another object (eg, swimwear) but do not release their preformed venom until they are exposed to air after removal from the water, at which point the organisms die and cell lysis results in injection of the venom.
Diver’s Dermatitis—Diver’s dermatitis (also referred to as “swimmer’s itch”) is a type I and IV hypersensitivity reaction caused by schistosome cercariae released by aquatic snails.4 Several cercarial species are known to cause diver’s dermatitis, but the most commonly implicated genera are Trichobilharzia and Gigantobilharzia. These parasites most commonly are found in freshwater lakes but also occur in oceans, particularly in brackish areas adjacent to freshwater access. Factors associated with increased concentrations of these parasites include shallow, slow-moving water and prolonged onshore wind causing accumulation near the shoreline. It also is thought that the snail host sheds greater concentrations of the parasite in the morning hours and after prolonged exposure to sunlight.4 These flatworm trematodes have a 2-host life cycle: snails function as intermediate hosts before the parasites enter their final hosts, birds. Humans function only as incidental and nonviable hosts. The parasites gain access to the human body by burrowing into exposed skin. Because the parasite cannot survive on a human host, it dies shortly after penetrating the skin, leading to an intense inflammatory response that causes pruritus within hours of exposure (Figure 2). The initial eruption progresses over a few days into a diffuse, maculopapular, pruritic rash similar to that seen in seabather’s eruption, then regresses completely in 1 to 3 weeks. Subsequent exposure to the same parasite is associated with increased severity of future rashes, likely due to antibody-mediated sensitization.4
Diagnosis—Marine-derived dermatoses from various sources can present very similarly; thus, it can be difficult to discern the specific etiology behind the clinical presentation. No commonly utilized imaging modalities differentiate between seabather’s eruption and diver’s dermatitis, but a thorough patient history often can aid in identifying the cause of the eruption. For example, lesions located only on nonexposed areas of the skin increase the likelihood of seabather’s eruption, owing to nematocysts being trapped between clothing and the skin. In contrast, diver’s dermatitis generally appears on areas of the skin that were directly exposed to water and uncovered by clothing.5 A patient report of no symptoms until shortly after exiting the water further supports a diagnosis of seabather’s eruption, as this delayed presentation is caused by lysis of the culprit organisms following removal from the marine environment. The cell lysis is responsible for the widespread injection of preformed venom via the numerous nematocysts trapped between clothing and the patient’s body.1
Treatment
For both conditions, the symptoms are treated with hydrocortisone or other topical steroid solutions in conjunction with oral hydroxyzine. Alternative treatments include calamine lotion with 1% menthol and nonsteroidal anti-inflammatory drugs. Taking baths with oatmeal, Epsom salts, or baking soda also may alleviate some of the pruritic symptoms.2
Prevention
The ability to diagnose the precise cause of these similar marine rashes can bring peace of mind to both patients and physicians, even though the management strategies are similar. Severe contact dermatitis of unknown etiology can be disconcerting for patients. Additionally, documenting the causes of marine rashes in particular geographic locations can help establish which organisms are most likely to affect visitors to those areas. This type of data collection can be utilized to develop preventative recommendations, such as deciding when to avoid the water. Education of the public can be accomplished with informational posters near popular swimming areas and online public service announcements. Informing the general public about the dangers of entering the ocean, especially during times of the year when nematocyst-equipped sea creatures are abundant, could prevent numerous cases of seabather’s eruption. Likewise, advising against immersion in shallow, slow-moving water during the morning hours or after prolonged sun exposure in trematode-endemic areas could prevent numerous cases of diver’s dermatitis. Basic information on what to expect if afflicted by a marine rash also may reduce the number of emergency department visits for these conditions, benefiting patients economically and lessening the load on emergency departments. If individuals can assure themselves of the self-limited nature of these dermatoses, they may be less inclined to seek medical consultation.
Final Thoughts
As the climate continues to change, the incidence of marine rashes such as seabather’s eruption and diver’s dermatitis is expected to increase, as warmer surface temperatures cause more frequent and earlier blooms of L unguiculata and E lineata. Cases of diver’s dermatitis also could increase because warmer temperatures lengthen the season of human exposure. The projected uptick in the incidence of these marine rashes makes understanding these pathologies even more pertinent for physicians.6 Increasing our understanding of the different types of marine rashes and their causes will help guide future recommendations for the general public when visiting the ocean.
Future research should investigate ways to prevent contact between these organisms and humans. Research in mice indicated that topical application of DEET (N,N-diethyl-meta-toluamide) prior to trematode exposure prevented penetration of the skin by parasitic worms.7 Further studies are needed to examine the effectiveness of this preventative technique in humans. For now, dermatologists can counsel ocean-going patients on preventative behaviors as well as provide reassurance and symptomatic relief when they present with marine rashes.
- Parrish DO. Seabather’s eruption or diver’s dermatitis? JAMA. 1993;270:2300-2301. doi:10.1001/jama.1993.03510190054021
- Tomchik RS, Russell MT, Szmant AM, et al. Clinical perspectives on seabather’s eruption, also known as ‘sea lice’. JAMA. 1993;269:1669-1672. doi:10.1001/jama.1993.03500130083037
- Bonamonte D, Filoni A, Verni P, et al. Dermatitis caused by algae and Bryozoans. In: Bonamonte D, Angelini G, eds. Aquatic Dermatology: Biotic, Chemical, and Physical Agents. Springer; 2016:127-137.
- Tracz ES, Al-Jubury A, Buchmann K, et al. Outbreak of swimmer’s itch in Denmark. Acta Derm Venereol. 2019;99:1116-1120. doi:10.2340/00015555-3309
- Freudenthal AR, Joseph PR. Seabather’s eruption. N Engl J Med. 1993;329:542-544. doi:10.1056/NEJM199308193290805
- Kaffenberger BH, Shetlar D, Norton SA, et al. The effect of climate change on skin disease in North America. J Am Acad Dermatol. 2016;76:140-147. doi:10.1016/j.jaad.2016.08.014
- Salafsky B, Ramaswamy K, He YX, et al. Development and evaluation of LIPODEET, a new long-acting formulation of N, N-diethyl-m-toluamide (DEET) for the prevention of schistosomiasis. Am J Trop Med Hyg. 1999;61:743-750. doi:10.4269/ajtmh.1999.61.743
Background and Clinical Presentation
Seabather’s Eruption—Seabather’s eruption is a type I and IV hypersensitivity reaction caused by nematocysts of larval-stage thimble jellyfish (Linuche unguiculata), sea anemones (eg, Edwardsiella lineata), and larval cnidarians.1Linuche unguiculata commonly is found along the southeast coast of the United States and in the Caribbean, the Gulf of Mexico, and the coasts of Florida; less commonly, it has been reported along the coasts of Brazil and Papua New Guinea. Edwardsiella lineata more commonly is seen along the East Coast of the United States.2 Seabather’s eruption presents as numerous scattered, pruritic, red macules and papules (measuring 1 mm to 1.5 cm in size) distributed in areas covered by skin folds, wet clothing, or hair following exposure to marine water (Figure 1). This maculopapular rash generally appears shortly after exiting the water and can last up to several weeks in some cases.3 The cause for this delayed presentation is that the marine organisms become entrapped between the skin of the human contact and another object (eg, swimwear) but do not release their preformed antivenom until they are exposed to air after removal from the water, at which point the organisms die and cell lysis results in injection of the venom.
Diver’s Dermatitis—Diver’s dermatitis (also referred to as “swimmer’s itch”) is a type I and IV hypersensitivity reaction caused by schistosome cercariae released by aquatic snails.4 There are several different cercarial species known to be capable of causing diver dermatitis, but the most commonly implicated genera are Trichobilharzia and Gigantobilharzia. These parasites most commonly are found in freshwater lakes but also occur in oceans, particularly in brackish areas adjacent to freshwater access. Factors associated with increased concentrations of these parasites include shallow, slow-moving water and prolonged onshore wind causing accumulation near the shoreline. It also is thought that the snail host will shed greater concentrations of the parasitic worm in the morning hours and after prolonged exposure to sunlight.4 These flatworm trematodes have a 2-host life cycle. The snails function as intermediate hosts for the parasites before they enter their final host, which are birds. Humans only function as incidental and nonviable hosts for these worms. The parasites gain access to the human body by burrowing into exposed skin. Because the parasite is unable to survive on human hosts, it dies shortly after penetrating the skin, which leads to an intense inflammatory response causing symptoms of pruritus within hours of exposure (Figure 2). The initial eruption progresses over a few days into a diffuse, maculopapular, pruritic rash, similar to that seen in seabather’s eruption. This rash then regresses completely in 1 to 3 weeks. Subsequent exposure to the same parasite is associated with increased severity of future rashes, likely due to antibody-mediated sensitization.4
Diagnosis—Marine-derived dermatoses from various sources can present very similarly; thus, it is difficult to discern the specific etiology behind the clinical presentation. No commonly utilized imaging modalities can differentiate between seabather’s eruption and diver’s dermatitis, but eliciting a thorough patient history often can aid in differentiation of the cause of the eruption. For example, lesions located only on nonexposed areas of the skin increases the likelihood of seabather’s eruption due to nematocysts being trapped between clothing and the skin. In contrast, diver’s dermatitis generally appears on areas of the skin that were directly exposed to water and uncovered by clothing.5 Patient reports of a lack of symptoms until shortly after exiting the water further support a diagnosis of seabather’s eruption, as this delayed presentation of symptoms is caused by lysis of the culprit organisms following removal from the marine environment. The cell lysis is responsible for the widespread injection of preformed venom via the numerous nematocysts trapped between clothing and the patient’s body.1
Treatment
For both conditions, the symptoms are treated with hydrocortisone or other topical steroid solutions in conjunction with oral hydroxyzine. Alternative treatments include calamine lotion with 1% menthol and nonsteroidal anti-inflammatory drugs. Taking baths with oatmeal, Epsom salts, or baking soda also may alleviate some of the pruritic symptoms.2
Prevention
The ability to diagnose the precise cause of these similar marine rashes can bring peace of mind to both patients and physicians regardless of their similar management strategies. Severe contact dermatitis of unknown etiology can be disconcerting for patients. Additionally, documenting the causes of marine rashes in particular geographic locations can be beneficial for establishing which organisms are most likely to affect visitors to those areas. This type of data collection can be utilized to develop preventative recommendations, such as deciding when to avoid the water. Education of the public can be done with the use of informational posters located near popular swimming areas and online public service announcements. Informing the general public about the dangers of entering the ocean, especially during certain times of the year when nematocyst-equipped sea creatures are in abundance, could serve to prevent numerous cases of seabather’s eruption. Likewise, advising against immersion in shallow, slow-moving water during the morning hours or after prolonged sun exposure in trematode-endemic areas could prevent numerous cases of diver’s dermatitis. Basic information on what to expect if afflicted by a marine rash also may reduce the number of emergency department visits for these conditions, thus providing economic benefit for patients and for hospitals since patients would better know how to acutely treat these rashes and lessen the patient load at hospital emergency departments. If individuals can assure themselves of the self-limited nature of these types of dermatoses, they may be less inclined to seek medical consultation.
Final Thoughts
As the climate continues to change, the incidence of marine rashes such as seabather’s eruption and diver’s dermatitis is expected to increase due to warmer surface temperatures causing more frequent and earlier blooms of L unguiculata and E lineata. Cases of diver’s dermatitis also could increase due to a longer season of more frequent human exposure from an increase in warmer temperatures. The projected uptick in incidences of these marine rashes makes understanding these pathologies even more pertinent for physicians.6 Increasing our understanding of the different types of marine rashes and their causes will help guide future recommendations for the general public when visiting the ocean.
Future research may wish to investigate unique ways in which to prevent contact between these organisms and humans. Past research on mice indicated that topical application of DEET (N,N-diethyl-meta-toluamide) prior to trematode exposure prevented penetration of the skin by parasitic worms.7 Future studies are needed to examine the effectiveness of this preventative technique on humans. For now, dermatologists may counsel our ocean-going patients on preventative behaviors as well as provide reassurance and symptomatic relief when they present to our clinics with marine rashes.
Background and Clinical Presentation
Seabather’s Eruption—Seabather’s eruption is a type I and IV hypersensitivity reaction caused by nematocysts of larval-stage thimble jellyfish (Linuche unguiculata), sea anemones (eg, Edwardsiella lineata), and larval cnidarians.1Linuche unguiculata commonly is found along the southeast coast of the United States and in the Caribbean, the Gulf of Mexico, and the coasts of Florida; less commonly, it has been reported along the coasts of Brazil and Papua New Guinea. Edwardsiella lineata more commonly is seen along the East Coast of the United States.2 Seabather’s eruption presents as numerous scattered, pruritic, red macules and papules (measuring 1 mm to 1.5 cm in size) distributed in areas covered by skin folds, wet clothing, or hair following exposure to marine water (Figure 1). This maculopapular rash generally appears shortly after exiting the water and can last up to several weeks in some cases.3 The cause for this delayed presentation is that the marine organisms become entrapped between the skin of the human contact and another object (eg, swimwear) but do not release their preformed antivenom until they are exposed to air after removal from the water, at which point the organisms die and cell lysis results in injection of the venom.
Diver’s Dermatitis—Diver’s dermatitis (also referred to as “swimmer’s itch”) is a type I and IV hypersensitivity reaction caused by schistosome cercariae released by aquatic snails.4 There are several different cercarial species known to be capable of causing diver dermatitis, but the most commonly implicated genera are Trichobilharzia and Gigantobilharzia. These parasites most commonly are found in freshwater lakes but also occur in oceans, particularly in brackish areas adjacent to freshwater access. Factors associated with increased concentrations of these parasites include shallow, slow-moving water and prolonged onshore wind causing accumulation near the shoreline. It also is thought that the snail host will shed greater concentrations of the parasitic worm in the morning hours and after prolonged exposure to sunlight.4 These flatworm trematodes have a 2-host life cycle. The snails function as intermediate hosts for the parasites before they enter their final host, which are birds. Humans only function as incidental and nonviable hosts for these worms. The parasites gain access to the human body by burrowing into exposed skin. Because the parasite is unable to survive on human hosts, it dies shortly after penetrating the skin, which leads to an intense inflammatory response causing symptoms of pruritus within hours of exposure (Figure 2). The initial eruption progresses over a few days into a diffuse, maculopapular, pruritic rash, similar to that seen in seabather’s eruption. This rash then regresses completely in 1 to 3 weeks. Subsequent exposure to the same parasite is associated with increased severity of future rashes, likely due to antibody-mediated sensitization.4
Diagnosis—Marine-derived dermatoses from various sources can present very similarly; thus, it is difficult to discern the specific etiology behind the clinical presentation. No commonly utilized imaging modalities can differentiate between seabather’s eruption and diver’s dermatitis, but eliciting a thorough patient history often can aid in differentiation of the cause of the eruption. For example, lesions located only on nonexposed areas of the skin increases the likelihood of seabather’s eruption due to nematocysts being trapped between clothing and the skin. In contrast, diver’s dermatitis generally appears on areas of the skin that were directly exposed to water and uncovered by clothing.5 Patient reports of a lack of symptoms until shortly after exiting the water further support a diagnosis of seabather’s eruption, as this delayed presentation of symptoms is caused by lysis of the culprit organisms following removal from the marine environment. The cell lysis is responsible for the widespread injection of preformed venom via the numerous nematocysts trapped between clothing and the patient’s body.1
Treatment
For both conditions, the symptoms are treated with hydrocortisone or other topical steroid solutions in conjunction with oral hydroxyzine. Alternative treatments include calamine lotion with 1% menthol and nonsteroidal anti-inflammatory drugs. Taking baths with oatmeal, Epsom salts, or baking soda also may alleviate some of the pruritic symptoms.2
Prevention
The ability to diagnose the precise cause of these similar marine rashes can bring peace of mind to both patients and physicians regardless of their similar management strategies. Severe contact dermatitis of unknown etiology can be disconcerting for patients. Additionally, documenting the causes of marine rashes in particular geographic locations can be beneficial for establishing which organisms are most likely to affect visitors to those areas. This type of data collection can be utilized to develop preventative recommendations, such as deciding when to avoid the water. Education of the public can be done with the use of informational posters located near popular swimming areas and online public service announcements. Informing the general public about the dangers of entering the ocean, especially during certain times of the year when nematocyst-equipped sea creatures are in abundance, could serve to prevent numerous cases of seabather’s eruption. Likewise, advising against immersion in shallow, slow-moving water during the morning hours or after prolonged sun exposure in trematode-endemic areas could prevent numerous cases of diver’s dermatitis. Basic information on what to expect if afflicted by a marine rash also may reduce the number of emergency department visits for these conditions, thus providing economic benefit for patients and for hospitals since patients would better know how to acutely treat these rashes and lessen the patient load at hospital emergency departments. If individuals can assure themselves of the self-limited nature of these types of dermatoses, they may be less inclined to seek medical consultation.
Final Thoughts
As the climate continues to change, the incidence of marine rashes such as seabather’s eruption and diver’s dermatitis is expected to increase due to warmer surface temperatures causing more frequent and earlier blooms of L unguiculata and E lineata. Cases of diver’s dermatitis also could increase due to a longer season of more frequent human exposure from an increase in warmer temperatures. The projected uptick in incidences of these marine rashes makes understanding these pathologies even more pertinent for physicians.6 Increasing our understanding of the different types of marine rashes and their causes will help guide future recommendations for the general public when visiting the ocean.
Future research may wish to investigate unique ways in which to prevent contact between these organisms and humans. Past research on mice indicated that topical application of DEET (N,N-diethyl-meta-toluamide) prior to trematode exposure prevented penetration of the skin by parasitic worms.7 Future studies are needed to examine the effectiveness of this preventative technique on humans. For now, dermatologists may counsel our ocean-going patients on preventative behaviors as well as provide reassurance and symptomatic relief when they present to our clinics with marine rashes.
- Parrish DO. Seabather’s eruption or diver’s dermatitis? JAMA. 1993;270:2300-2301. doi:10.1001/jama.1993.03510190054021
- Tomchik RS, Russell MT, Szmant AM, et al. Clinical perspectives on seabather’s eruption, also known as ‘sea lice’. JAMA. 1993;269:1669-1672. doi:10.1001/jama.1993.03500130083037
- Bonamonte D, Filoni A, Verni P, et al. Dermatitis caused by algae and Bryozoans. In: Bonamonte D, Angelini G, eds. Aquatic Dermatology: Biotic, Chemical, and Physical Agents. Springer; 2016:127-137.
- Tracz ES, Al-Jubury A, Buchmann K, et al. Outbreak of swimmer’s itch in Denmark. Acta Derm Venereol. 2019;99:1116-1120. doi:10.2340/00015555-3309
- Freudenthal AR, Joseph PR. Seabather’s eruption. N Engl J Med. 1993;329:542-544. doi:10.1056/NEJM199308193290805
- Kaffenberger BH, Shetlar D, Norton SA, et al. The effect of climate change on skin disease in North America. J Am Acad Dermatol. 2016;76:140-147. doi:10.1016/j.jaad.2016.08.014
- Salafsky B, Ramaswamy K, He YX, et al. Development and evaluation of LIPODEET, a new long-acting formulation of N,N-diethyl-m-toluamide (DEET) for the prevention of schistosomiasis. Am J Trop Med Hyg. 1999;61:743-750. doi:10.4269/ajtmh.1999.61.743
Practice Points
- Seabather’s eruption and diver’s dermatitis have similar clinical presentations but differ in the ways that organisms come in contact with the skin.
- No commonly utilized imaging modality can differentiate between seabather’s eruption and diver’s dermatitis, but eliciting a thorough history often can aid in differentiating these marine rashes.
- Physicians should understand the pathologies of common marine rashes due to a projected uptick in the number of cases related to climate change.
Creatinine variability linked to liver transplant outcomes
Patients with greater changes in serum creatinine are more likely to have worse pre- and post–liver transplant outcomes. Moreover, underserved patients may be most frequently affected, according to a retrospective analysis of UNOS (United Network for Organ Sharing) data.
These results should drive further development of serum creatinine coefficient of variation (sCr CoV) as an independent predictor of renal-related mortality risk, according to lead author Giuseppe Cullaro, MD, of the University of California, San Francisco, and colleagues.
“Intra-individual clinical and laboratory parameter dynamics often provide additional prognostic information – added information that goes beyond what can be found with cross-sectional data,” the researchers wrote in Hepatology. “This finding has been seen in several scenarios in the general population – intra-individual variability in blood pressure, weight, hemoglobin, and kidney function, have all been associated with worse clinical outcomes. However, in cirrhosis patients, and more specifically in patients awaiting a liver transplant, kidney function dynamics as a predictor of clinical outcomes has yet to be investigated.”
To gauge the predictive power of shifting kidney values, Dr. Cullaro and colleagues analyzed UNOS/OPTN (Organ Procurement and Transplantation Network) registry data from 2011 through 2019. Exclusion criteria included patients who were aged younger than 18 years, were listed as status 1, received a living donor liver transplantation, were on hemodialysis, or had fewer than three updates. The final dataset included 25,204 patients.
After the researchers sorted patients into low, intermediate, and high sCr CoV tertiles, they used logistic regression to determine relationships between higher sCr and a variety of covariates, such as age, sex, diagnosis, presence of acute kidney injury, or chronic kidney disease. A competing risk regression was then done to look for associations between wait list mortality and the covariables, with liver transplant used as the competing risk.
The median sCr CoV was 17.4% (interquartile range [IQR], 10.8%-29.5%). Patients in the bottom sCr CoV tertile had a median value of 8.8% (IQR, 6.6%-10.8%), compared with 17.4% (IQR, 14.8%-20.4%) in the intermediate variability group and 36.8% (IQR, 29.5%-48.8%) in the high variability group. High variability was associated with female sex, Hispanic ethnicity, ascites, and hepatic encephalopathy as well as higher body mass index, MELDNa score, and serum creatinine.
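The coefficient of variation behind these tertiles is a standard statistic: the standard deviation of a patient’s serial creatinine values divided by their mean, expressed as a percentage. A minimal sketch of how it could be computed for patients with at least three creatinine updates (the study’s inclusion criterion); the patient values below are illustrative, not taken from the study:

```python
import statistics

def creatinine_cov(values):
    """Coefficient of variation (%) of a serum creatinine series:
    sample standard deviation divided by the mean, times 100."""
    mean = statistics.fmean(values)
    sd = statistics.stdev(values)  # sample (n-1) standard deviation
    return 100 * sd / mean

# Hypothetical creatinine series (mg/dL), each with >= 3 updates.
stable = [1.0, 1.1, 0.9, 1.0]
fluctuating = [0.9, 1.8, 1.1, 2.4]

print(round(creatinine_cov(stable), 1))       # low-variability patient
print(round(creatinine_cov(fluctuating), 1))  # high-variability patient
```

Under the study’s cutoffs, the first series would fall in the low-variability tertile (median 8.8%) and the second well into the high-variability tertile (median 36.8%).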
Of note, each decreasing serum creatinine variability tertile was associated with a significantly lower rate of wait list mortality (34.7% vs. 19.6% vs. 11.7%; P < .001). The creatinine variability profiles were similarly associated with the likelihood of receiving a liver transplant (52.3% vs. 48.9% vs. 43.7%; P < .001) and posttransplant mortality (7.5% vs. 5.5% vs. 3.9%; P < .001).
A multivariate model showed that each 10% increase in sCr CoV predicted an 8% increase in the risk of a combined outcome of post–liver transplant death or kidney transplant after liver transplant (KALT), independent of other variables (adjusted hazard ratio, 1.08; 95% confidence interval, 1.05-1.11).
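Because the hazard ratio is reported per 10% increment, the effect compounds multiplicatively across larger differences in CoV under the proportional-hazards framing. A short illustration (the 30-percentage-point gap is a made-up comparison, not a figure from the study):

```python
# Adjusted HR of 1.08 per 10% increase in sCr CoV, from the study.
hr_per_10pct = 1.08

# Hypothetical comparison: a patient whose sCr CoV is 30 percentage
# points higher than another's carries a compounded relative hazard.
increase_pct = 30
relative_hazard = hr_per_10pct ** (increase_pct / 10)
print(round(relative_hazard, 2))  # ≈ 1.26, a roughly 26% higher hazard
```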
“These data highlight that all fluctuations in sCr are associated with worse pre- and post–liver transplant outcomes,” the investigators concluded. “Moreover, the groups that are most underserved by sCr, specifically women, were most likely to have greater sCr CoVs. We believe our work lays the foundation for implementing the sCr CoV as an independent metric of renal-related mortality risk and may be most beneficial for those groups most underserved by sCr values alone.”
According to Brian P. Lee, MD, a hepatologist with Keck Medicine of USC and assistant professor of clinical medicine with the Keck School of Medicine of USC in Los Angeles, “this is a great study ... in an area of high need” that used “high quality data.”
Current liver allocation strategies depend on a snapshot of kidney function, but these new findings suggest that a more dynamic approach may be needed. “As a practicing liver specialist I see that creatinine numbers can fluctuate a lot. ... So which number do you use when you’re trying to calculate what a patient’s risk of death is on the wait list? This study gets toward that answer. If there is a lot of variability, these might be higher risk patients; these might be patients that we should put higher on the transplant waiting list,” said Dr. Lee.
He suggested that clinicians should account for creatinine fluctuations when considering mortality risk; however, the evidence is “not quite there yet ... in terms of changing transplant policy and allocation.” He pointed out three unanswered questions: Why are creatinine values fluctuating? How should fluctuations be scored for risk modeling? And, what impact would those risk scores have on transplant waitlist prioritization?
“I think that that’s the work that you would need to do before you could really change national transplant policy,” Dr. Lee concluded.
The study was supported by the National Institutes of Health and the UCSF Liver Center. Dr. Cullaro and another author have disclosed relationships with Mallinckrodt Pharmaceuticals and Axcella Health, respectively. Dr. Lee reported no conflicts of interest.
FROM HEPATOLOGY
NeoChemo preserves rectum in half of patients with rectal cancer
Among patients with stage II or stage III rectal adenocarcinoma, organ preservation is achievable in up to half of patients who undergo total neoadjuvant chemotherapy (TNT), according to the results from a new randomized phase 2 trial.
The study included 324 patients from 18 centers who were randomized into one of two groups: induction chemotherapy followed by chemoradiotherapy (INCT-CRT) or chemoradiotherapy followed by consolidation chemotherapy (CRT-CNCT). Patients in both groups then underwent either total mesorectal excision (TME) or a watch-and-wait strategy, depending on tumor response.
“What the study shows is that the order of the chemo and the radiation dose doesn’t affect survival, but it seems to affect the probability of preserving the rectum. That data is consistent with other studies that have compared head-to-head chemotherapy followed by radiation versus radiation followed by chemotherapy. In addition, the survival rate for this study is no different from other prospective studies that included patients with similar-stage tumors selected by MRI. So the data suggest that you can probably avoid surgery in half of the patients with locally advanced rectal cancer and still achieve similar survival compared to patients treated with more conventional neoadjuvant treatments and mandatory surgery,” said lead author Julio Garcia-Aguilar, MD, PhD, in an interview.
“It is a significant shift in the treatment paradigm, that can potentially benefit half of the 50,000 rectal cancer patients diagnosed every year in the United States,” said Dr. Garcia-Aguilar, chief of colorectal surgery at Memorial Sloan Kettering Cancer Center, New York.
The study was published online in the Journal of Clinical Oncology.
Neoadjuvant CRT, TME, and adjuvant chemotherapy is an effective treatment strategy for locally advanced rectal adenocarcinoma, but the regimen can cause bowel, urinary, and sexual dysfunction. The majority of adverse effects from the therapy can be traced to surgery. In addition, patients with distal rectal cancer often require a permanent colostomy.
TNT is a newer approach that delivers chemotherapy plus radiotherapy before surgery. It is designed to improve treatment compliance and eradicate micrometastases in advance of surgery.
After a median follow-up of 3 years, disease-free survival (76% in both groups) was similar to historical controls (75%). Both groups had similar rates of local recurrence-free survival (94% each) and distant metastasis–free survival (84% for INCT-CRT and 82% for CRT-CNCT).
Following TNT, 26% of patients were recommended for TME (28% in the INCT-CRT group and 24% in the CRT-CNCT group); the rest were offered watch-and-wait surveillance. Tumor regrowth occurred in 40% of the watch-and-wait patients in the INCT-CRT group and 27% of those in the CRT-CNCT group. Of these combined 75 patients, 67 underwent successful salvage surgery.
In the intention-to-treat analysis, 53% of patients had a preserved rectum at 3 years (95% confidence interval, 45%-62%) in the CRT-CNCT group versus 41% in the INCT-CRT group (95% CI, 33%-50%; P = .01).
The new results reinforce other findings and should contribute to shifting clinical practice, according to Dr. Garcia-Aguilar. “I think what we have learned is that rectal cancers respond to chemotherapy and radiation at a higher rate than we thought previously, but that the response takes time. That’s something that we use currently in an adaptive way to modify the treatment as we observe the tumor response,” he said.
The slow regrowth means that patients can be closely monitored without undue risk, but such an approach demands buy-in from the patient. “The patient needs to be compliant with a close surveillance protocol, because otherwise it can be a disaster. I think that’s really part of the message,” Dr. Garcia-Aguilar said.
Dr. Garcia-Aguilar has an ownership interest in Intuitive Surgical and has advised or consulted for Medtronic, Intuitive Surgical, and Johnson & Johnson.
FROM JOURNAL OF CLINICAL ONCOLOGY