Subtle visual dysfunctions often precede early-stage psychosis


Subtle subjective visual dysfunctions (VisDys) are common and are associated with poorer outcomes in patients with schizophrenia or recent-onset psychosis and in those at clinical high risk (CHR) for psychosis, new research suggests.

A multinational group of investigators found that VisDys were reported considerably more often by patients with recent-onset psychosis and those at CHR than by patients with recent-onset depression or healthy controls.

In addition, more severe vision problems were associated with less functional remission both for patients at CHR and for those with recent-onset psychosis. Among patients at CHR, VisDys were also linked to lower quality of life (QOL), more severe depressive symptoms, and greater impairment of visuospatial constructability.

The researchers used functional MRI (fMRI) to compare resting-state functional brain connectivity in participants with recent-onset psychosis, CHR, and recent-onset depression. They found that the occipital network (ON) and frontoparietal network (FPN) were particularly implicated in VisDys.

“Subtle VisDys should be regarded as a frequent phenomenon across the psychosis spectrum, impinging negatively on patients’ current ability to function in several settings of their daily and social life, their QOL, and visuospatial abilities,” write investigators led by Johanna Schwarzer, Institute for Translational Psychiatry, University of Muenster (Germany).

“These large-sample study findings suggest that VisDys are clinically highly relevant not only in [recent-onset psychosis] but especially in CHR,” they stated.

The findings were published online in Neuropsychopharmacology.
 

Subtle, underrecognized

Whereas VisDys are uncommon in patients with nonpsychotic disorders, approximately 50%-60% of patients diagnosed with schizophrenia report VisDys involving brightness, motion, form, color perception, or distorted perception of their own face, the researchers reported.

These “subtle” VisDys are “often underrecognized during clinical examination, despite their clinical relevance related to suicidal ideation, cognitive impairment, or poorer treatment response,” they wrote.

Most research into these vision problems in patients with schizophrenia has focused on patients in whom the illness is in a stable, chronic state – although VisDys often appear years before a psychotic disorder is diagnosed.

Moreover, there has been little research into the neurobiological underpinnings of VisDys, specifically in early states of psychosis and/or in comparison to other disorders, such as depression.

The Personalised Prognostic Indicators for Early Psychosis Management (PRONIA) Consortium studied the psychophysiological phenomenon of VisDys in a large sample of adolescents and young adults. The sample consisted of three diagnostic groups: those with recent-onset psychosis, those with CHR, and those with recent-onset depression.

VisDys in daily life were measured using the Schizophrenia Proneness Instrument–Adult Scale (SPI-A), which assesses basic symptoms that indicate increased risk for psychosis.
 

Visual information processing

Resting-state imaging data on intrinsic brain networks were also assessed in the PRONIA sample and were analyzed across 12,720 functional connectivities between 160 regions of interest across the whole brain.

In particular, the researchers were interested in the primary networks involved in visual information processing, especially the dorsal visual stream, with further focus on the ON and FPN intrinsic subnetworks.

The ON was chosen because it comprises “primary visual processing pathways,” while the FPN is “widely suggested to modulate attention related to visual information processing at higher cognitive levels.”

The investigators used a machine-learning multivariate pattern analysis approach that “enables the consideration of multiple interactions within brain systems.”

The current study involved 721 participants from the PRONIA database, including 147 participants with recent-onset psychosis (mean age, 28.45 years; 60.5% men), 143 with CHR (mean age, 26.97 years; about 50% men), 151 with recent-onset depression (mean age, 29.13 years; 47% men), and 280 in the healthy-controls group (mean age, 28.54 years; 39.4% men).

The researchers selected 14 SPI-A items representing different aspects of VisDys. Severity was defined as the maximum frequency within the past 3 months, rated from 1 (never) to 6 (daily).

The 14 items were as follows: oversensitivity to light and/or certain visual perception objects, photopsia, micropsia/macropsia, near and tele-vision, metamorphopsia, changes in color vision, altered perception of a patient’s own face, pseudomovements of optic stimuli, diplopia or oblique vision, disturbances of the estimation of distances or sizes, disturbances of the perception of straight lines/contours, maintenance of optic stimuli (“visual echoes”), partial seeing (including tubular vision), and captivation of attention by details of the visual field.

Participants also completed the Beck Depression Inventory–II (BDI-II), the Positive and Negative Syndrome Scale (PANSS), the Functional Remission in General Schizophrenia scale, and several other scales that measure global and social functioning.

Other assessments included QOL and the Rey-Osterrieth Complex Figure Test, which is a neuropsychological measurement of visuospatial constructability.
 
Specific to early-stage psychosis?

Results showed that VisDys were reported more frequently in both recent-onset psychosis and CHR groups compared with the recent-onset depression and healthy control groups (50.34% and 55.94% vs. 16.56% and 4.28%, respectively).

The investigators noted that VisDys sum scores “showed high internal consistency” (Cronbach’s alpha, 0.78 across all participants).

Among those with recent-onset psychosis, a higher VisDys sum score was correlated with lower scores for functional remission (P = .036) and social functioning (P = .014).

In CHR, higher VisDys sum scores were associated with lower scores for health-related functional remission (P = .024), lower physical and psychological QOL (P = .004 and P = .015, respectively), more severe depression on the BDI-II (P = .021), and more impaired visuospatial constructability (P = .027).

Among those with recent-onset depression and their healthy peers, “no relevant correlations were found between VisDys sum scores and any parameters representing functional remission, QOL, depressiveness, or visuospatial constructability,” the researchers wrote.

A total of 135 participants with recent-onset psychosis, 128 with CHR, and 134 with recent-onset depression also underwent resting-state fMRI.

ON functional connectivity predicted the presence of VisDys in patients with recent-onset psychosis and those at CHR, with a balanced accuracy of 60.17% (P = .0001) and 67.38% (P = .029), respectively. In the combined recent-onset psychosis plus CHR sample, VisDys were predicted by FPN functional connectivity (balanced accuracy, 61.1%; P = .006).

“Findings from multivariate pattern analysis support a model of functional integrity within ON and FPN driving the VisDys phenomenon and being implicated in core disease mechanisms of early psychosis states,” the investigators noted.

“The main findings from this large sample study support the idea of VisDys being specific to the psychosis spectrum already at early stages,” while being less frequently reported in recent-onset depression, they wrote. VisDys also “appeared negligible” among those without psychiatric disorders.
 

Regular assessment needed

Steven Silverstein, PhD, professor of biopsychosocial medicine and professor of psychiatry, neuroscience, and ophthalmology, Center for Visual Science, University of Rochester (N.Y.) Medical Center, called the findings “important” because “they will increase appreciation in the field of mental health for the frequency and disabling nature of visual symptoms and the need for regular assessment in routine clinical practice with people at risk for or with psychotic disorders.”


In addition, “the brain imaging findings are providing needed information that could lead to treatments that target the brain networks generating the visual symptoms,” such as neurofeedback or brain stimulation, said Dr. Silverstein, who was not involved with the research.

The study was funded by a grant to the PRONIA Consortium. Individual researchers received funding from a NARSAD Young Investigator Award of the Brain and Behavior Research Foundation, the Koeln Fortune Program/Faculty of Medicine, University of Cologne, and the European Union’s Horizon 2020 research and innovation program. Open Access funding was enabled and organized by Projekt DEAL. Ms. Schwarzer and Dr. Silverstein reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.


AGA Clinical Practice Update: Expert review on endoscopic management for recurrent acute and chronic pancreatitis


 

Endoscopy plays an integral role in the evaluation and management of patients with recurrent acute pancreatitis and chronic pancreatitis, according to a new American Gastroenterological Association clinical practice update published in Gastroenterology.

Acute pancreatitis remains the leading cause of inpatient care among gastrointestinal conditions, with about 10%-30% of patients developing recurrent acute pancreatitis, wrote co–first authors Daniel Strand, MD, from the University of Virginia Health System, Charlottesville, and Ryan J. Law, MD, from the Mayo Clinic, Rochester, Minn., and colleagues. About 35% of patients with recurrent acute pancreatitis will progress to chronic pancreatitis. Both conditions are associated with significant morbidity and mortality.

“Interventions aimed to better evaluate, mitigate the progression of, and treat symptoms related to [acute pancreatitis] and [chronic pancreatitis] are critical to improve patients’ quality of life and other long-term outcomes,” the authors of the expert review wrote.

The authors reviewed randomized controlled trials, observational studies, systematic reviews and meta-analyses, and expert consensus in the field to develop eight clinical practice advice statements.

First, when the initial evaluation reveals no clear explanation for acute or recurrent pancreatitis, endoscopic ultrasound (EUS) is the preferred diagnostic test. The authors noted that, although the optimal timing of EUS is not concretely defined, most experts advise a short delay of 2-6 weeks after resolution of acute pancreatitis. MRI with contrast and cholangiopancreatography can be a reasonable complementary or alternative test, based on local expertise and availability.

Second, the role of endoscopic retrograde cholangiopancreatography (ERCP) remains controversial for reducing the frequency of acute pancreatitis episodes in patients with pancreas divisum, the most common congenital pancreatic anomaly, the authors wrote. Minor papilla endotherapy may be useful, particularly for those with objective signs of outflow obstruction, such as a dilated dorsal pancreatic duct or santorinicele. However, there is no role for ERCP in treating pain alone in patients with pancreas divisum.

Third, ERCP remains even more controversial for reducing the frequency of pancreatitis episodes in patients with unexplained recurrent acute pancreatitis and standard pancreatic ductal anatomy, according to the authors. It should only be considered after a comprehensive discussion of the uncertain benefits and potentially severe procedure-related adverse events. When used, ERCP with biliary sphincterotomy alone may be preferable to dual sphincterotomy.

Fourth, for long-term treatment of patients with painful obstructive chronic pancreatitis, surgical intervention should be considered over endoscopic therapy, the study authors wrote. Pain is the most common symptom and important driver of impaired quality of life in patients with chronic pancreatitis, among whom a subset will be affected by intraductal hypertension from an obstructed pancreatic duct. The authors noted that endoscopic intervention remains a reasonable alternative to surgery for suboptimal operative candidates or patients who want a less-invasive approach, as long as they are clearly informed that the best practice advice primarily favors surgery.

Fifth, when using ERCP for pancreatic duct stones, small main pancreatic duct stones of 5 mm or less can be treated with pancreatography and conventional stone extraction maneuvers. For larger stones, however, extracorporeal shockwave lithotripsy or pancreatoscopy with intraductal lithotripsy can be considered, although the former is not widely available in the United States and the success rates for the latter vary.

Sixth, when using ERCP for pancreatic duct strictures, prolonged stent therapy for 6-12 months is effective for treating symptoms and remodeling main pancreatic duct strictures. The preferred approach is to place multiple plastic stents in parallel, sequentially adding stents over time (up-sizing). Emerging evidence suggests that fully covered self-expanding metal stents may be useful in this setting, but additional research is needed. For example, one study suggested that patients treated with these self-expanding stents required fewer ERCPs, but their adverse event rate was significantly higher (39% vs. 14%).

Seventh, ERCP with stent insertion is the preferred treatment for benign biliary stricture caused by chronic pancreatitis. Fully covered self-expanding metal stents are favored over placing multiple plastic stents when feasible, given the similar efficacy but significantly lower need for stent exchange procedures during the treatment course.

Eighth, celiac plexus block shouldn’t be routinely performed for the management of pain caused by chronic pancreatitis. It could be considered on a case-by-case basis for patients with debilitating pain that hasn’t responded to other therapeutic measures, and only after a discussion of the unclear outcomes and the procedural risks.

“Given the current lack of evidence, additional well-designed prospective comparative studies are needed to support a more unified diagnostic and therapeutic pathway for the treatment of these complex cases,” the authors concluded.

The authors reported no grant support or funding sources for this report. Several authors disclosed financial relationships with companies such as Olympus America, Medtronic, and Microtech.
 

FROM GASTROENTEROLOGY


Psychiatrists’ views on psychoactive drugs clash with U.S. policy


Psychiatrists’ perceptions of the safety, therapeutic value, and abuse potential of psychoactive drugs are inconsistent with current drug policy, results from a new survey show.

“The consensus among experts, including psychiatrists, about specific drugs is not consistent or congruent with the schedule of these drugs” in the United States, lead author Adam Levin, MD, third-year psychiatry resident, Ohio State University, Columbus, and affiliate scholar at the Center for Psychedelic Drug Research and Education, Ohio State College of Social Work, told this news organization.

Dr. Levin stressed the importance of appropriate drug scheduling to improve access to treatments such as psilocybin (the psychoactive compound in psychedelic mushrooms) and 3,4-methylenedioxymethamphetamine (MDMA), which are now being tested for psychiatric disorders.

“We are in the middle of a mental health crisis so having any new tools would be really important,” he said.

The survey findings were published online in the International Journal of Drug Policy.
 

Five drug schedules

The Controlled Substances Act of 1970 created five “schedules” that organized drugs from most to least dangerous (schedule I-V). However, Dr. Levin said that the schedules do not accurately reflect the harms or therapeutic benefits of the various drugs.

Some drugs in lower, less restrictive schedules have greater potential for harm than do those in higher schedules, he noted. For example, methamphetamine, which has been recalled in multiple formulations because of concerns about abuse and limited medical use, remains a schedule II drug.

In addition, several schedule I drugs, including psilocybin and MDMA that are deemed dangerous and of no medical value, have shown therapeutic potential and low rates of misuse, addiction, or physical harm, the investigators noted.

In fact, the Food and Drug Administration has granted breakthrough therapy status to psilocybin for treatment-resistant depression and major depressive disorder (MDD) and to MDMA for posttraumatic stress disorder. This has positioned these drugs for possible FDA approval within the next few years.

Access to schedule I drugs for research purposes is tightly controlled. “Once psilocybin was placed in schedule I, there was this massive drop-off in the research funding and amount of research; and we’re just now starting to understand the potential therapeutic value of this drug,” said Dr. Levin.

Even with a recent research resurgence, most studies are funded by charitable donations or for-profit companies because of continued hesitancy on the part of grant-making organizations, he added.
 

Apparent contradictions

Given the pending approval of several schedule I drugs and escalating abuse of drugs in lower schedules, there is a growing need to understand physician attitudes surrounding the apparent contradictions in the drug schedule, the investigators noted.

Their survey included a geographically diverse group of 181 mostly middle-aged psychiatrists (65.2% were men) with an average of 16.2 years of practice after residency.

Participants were randomly assigned to respond to a vignette depicting a clinical scenario where a patient wants one of four drugs to help treat severe depression: psilocybin, a schedule I drug; methamphetamine (Desoxyn), a schedule II drug; ketamine, a schedule III drug; or alprazolam (Xanax), a schedule IV drug.

Each of these therapies has established antidepressant properties, but none are FDA approved for treatment of MDD. However, an intranasal formulation of the ketamine enantiomer esketamine (Spravato) was recently approved for treatment-resistant depression.

There were significant differences among the groups presented with the different vignettes. Participants were more likely to warn against repeated use of methamphetamine and alprazolam, and against the development of a new psychiatric problem with these drugs, than with psilocybin or ketamine.

Respondents were most concerned about increased suicide risk after the nonprescribed use of alprazolam compared with psilocybin and ketamine.

Compared with all other drugs, ketamine was more likely to be integrated into treatment plans.
 

 

 

Therapeutic value, abuse potential

Participants were asked to rate the safety, therapeutic value, and abuse potential of the four drugs as well as alcohol, a nonscheduled legal drug, if used properly or as directed.

Respondents viewed psilocybin and ketamine as similarly safe – and safer than methamphetamine and alprazolam. They considered ketamine as having the highest therapeutic potential, followed by psilocybin, and then alprazolam and methamphetamine. “Last was alcohol, which we expected because alcohol is not used therapeutically,” said Dr. Levin.

Survey completers viewed methamphetamine, alprazolam, and alcohol as having similarly high abuse potential, and ketamine as having mid-level abuse potential. Psilocybin was rated as having the lowest abuse potential, “which is exactly the opposite of what is implied by its schedule I status,” noted Dr. Levin.

The results provide evidence these drugs “are incorrectly scheduled,” he said.

“This suggests the schedule does not reflect current evidence, which I think is really important to understand because there are consequences to the drug schedule,” including criminal justice and research consequences, he added.

Dr. Levin pointed out that possession of drugs in more harmful schedules is linked to sometimes lengthy prison sentences.

The psychiatrists’ perception of the drugs “overlaps pretty significantly” with recent surveys of other mental health professionals, including psychologists and addiction experts, he noted.

The study was funded by the Drug Enforcement and Policy Center, Moritz College of Law, and The Ohio State University. Dr. Levin reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.


FROM THE INTERNATIONAL JOURNAL OF DRUG POLICY


AI algorithm detects erosions, ankylosis with high accuracy in patients with sacroiliitis


 

Erosions and ankylosis in patients with sacroiliitis can be detected with high accuracy on CT images using an artificial intelligence (AI)–based algorithm, according to research presented at the 13th International Congress on Spondyloarthritides.

Lennart Jans, MD, head of clinics in musculoskeletal imaging in the department of radiology at Ghent (Belgium) University Hospital, shared data on the development and validation of the algorithm for automatic detection of erosion and ankylosis on CT images of the sacroiliac (SI) joints.


“Essentially, in terms of statistics, this AI algorithm has 95% sensitivity for picking up erosions in patients with clinical symptoms of sacroiliitis, and if this is further developed as a tool, it could aid detection in people with erosions that would otherwise go undetected and undiagnosed,” Dr. Jans said in an interview, stressing that the results were still preliminary.
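The 95% figure is a sensitivity (recall) value: the proportion of truly present erosions that the algorithm flags. As a generic illustration only (the counts below are hypothetical, not taken from the study), sensitivity can be computed from detection counts like this:

```python
def sensitivity(true_positives, false_negatives):
    # Sensitivity (recall) = TP / (TP + FN): the fraction of
    # lesions actually present that the algorithm detects.
    return true_positives / (true_positives + false_negatives)

# Hypothetical example: detecting 95 of 100 true erosions
# corresponds to the 95% sensitivity quoted above.
print(sensitivity(95, 5))  # 0.95
```

Note that a high sensitivity alone says nothing about false alarms; specificity or precision figures would be needed to judge how often the algorithm flags erosions that are not there.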

“We want to move from reporting one patient at a time to a system that detects and helps to diagnose larger numbers of patients and makes a larger impact on patient outcomes.”

He stressed that, with thousands of images per patient, it is an impossible workload for any radiology department to read every image necessary to inform diagnoses, and this is only exacerbated by the shortage of rheumatologists, especially in the United States.

Denis Poddubnyy, MD, head of rheumatology at Charité University Hospital, Berlin, acknowledged that AI has potential to improve the recognition of changes indicative of spondyloarthritis (SpA) on imaging. “A standardized, valid, and reliable detection of those changes is relevant for both diagnosis, including differential diagnosis, and classification of SpA.”

Dr. Poddubnyy added that the AI-based algorithm developed by Dr. Jans and associates is designed to detect very specific SpA structural changes in the SI joints on CT. “CT is usually applied in the clinical practice after MRI ... normally in cases where MRI does not provide conclusive results,” he said. Since MRI scans have also been recently used to develop an AI-based algorithm for the detection of active inflammation – not captured by CT – and structural changes in SI joints, he noted that the “generated data on CT should be, therefore, seen in a broader context toward standardization of imaging findings detection.”
 

Proof-of-concept findings are due for scale-up

Dr. Jans acknowledged that the current data only establish proof of concept. Among the study’s 145 patients, 60% were used for training the AI algorithm and 40% for testing it. All patients who had clinical symptoms of sacroiliitis and had undergone an SI joint CT scan were included from two hospitals: Ghent University Hospital and the University of Alberta Hospital, Edmonton. The majority of patients were female (81 of 145). They had a mean age of 40 years; 84 had diagnosed axial SpA, 15 had mechanical back pain, and 46 did not have a final diagnosis.
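The article reports only the 60/40 proportion, not how the split was performed. A minimal sketch of such a patient-level split (illustrative only; the function name, seed, and shuffling approach are assumptions, not the authors' method):

```python
import random

def split_patients(patient_ids, train_frac=0.6, seed=42):
    """Randomly split patient IDs into training and test sets.

    Illustrative sketch only -- the study describes a 60/40 split
    but not the exact splitting procedure.
    """
    rng = random.Random(seed)
    ids = list(patient_ids)
    rng.shuffle(ids)
    n_train = round(len(ids) * train_frac)
    return ids[:n_train], ids[n_train:]

# 145 patients, as in the study cohort
train, test = split_patients(range(145))
print(len(train), len(test))  # 87 training, 58 test patients
```

Splitting at the patient level (rather than the image level) matters here, since each patient contributes many CT slices and image-level splits would leak patient data between sets.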

CT images were examined by three independent and blinded radiologists, who annotated erosions larger than 1 mm and ankylosis larger than 2 mm, while a neural network pipeline – the type of AI algorithm used here – was developed to segment the SI joints and detect structural lesions.

In the first instance, Dr. Jans explained, examination with the AI algorithm of CT images from patients who enter the hospital for other reasons, such as trauma, rheumatic diseases, kidney stones, or appendicitis, might lead to the detection of otherwise unknown erosions. “Often patients have complained of backache for years, seeing various physiotherapists and similar, but had no idea what might be causing it,” he said. “We just don’t have the time for examining all the thousands of images separately. We need some kind of aid here. We need an extra pair of eyes. This is what AI software does.”

Dr. Jans said rheumatologists who ultimately want to detect and diagnose patients with SI erosions aim above all to reduce false-negative findings. “They want the system to pick up all the patients who have erosions. Here, the most important parameter is sensitivity, and we find that our algorithm shows a very high sensitivity. Optimization of the AI algorithm to reduce false negatives resulted in a sensitivity of 95% for detection of erosions on CT of the sacroiliac joints on a patient level.”
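Sensitivity is the fraction of truly diseased patients the model flags; driving false negatives down drives it toward 1.0, which is the optimization goal described above. A quick worked example (the counts are illustrative, not the study's actual confusion matrix):

```python
def sensitivity(true_pos, false_neg):
    """Sensitivity (recall) = TP / (TP + FN): the fraction of
    erosion-positive patients the model correctly flags.

    Counts are illustrative, not taken from the study.
    """
    return true_pos / (true_pos + false_neg)

# e.g. 19 of 20 erosion-positive patients detected -> 0.95 sensitivity
print(sensitivity(19, 1))  # 0.95
```

Note that sensitivity says nothing about false positives; a screening tool tuned this way typically trades some specificity for the near-zero miss rate clinicians want.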

While overall accuracy was over 90%, Dr. Jans acknowledged that the algorithm was run in a relatively select population of dedicated CT scans of the joints. He is also aware that a good AI algorithm needs to work well across locations and populations. “If you make something within your institution alone, it will not work in a hospital on the other side of the street.”

However, he added, the researchers used images from four different CT scanners across two institutions – one in Canada and their own in Belgium – providing a case mix that makes their algorithm more refined.
 

Next step: Test in an unselected population

When asked to comment on the study, Mikael Boesen, MD, PhD, of Bispebjerg and Frederiksberg Hospital, Copenhagen, congratulated Dr. Jans on the work and remarked that he found the research potentially clinically useful.

“The next steps would be to test the performance of the model in an unselected population of patients who have CT scans of the abdomen for other reasons to test the model’s ability to flag potential SI joint disease to the reader, which is often overlooked, as well as [to see] how the model performs in larger datasets from other hospitals, vendors, and CT-reconstruction algorithms.”

Finally, Dr. Boesen pointed out that it would be interesting to see if the AI algorithm can detect different reasons for erosions. “Especially [for] separation between mechanical and inflammatory courses. This could potentially be done by automatically mapping the location of the erosions in the SI joints.”

Dr. Jans has now opened up the project to other radiologists to collaborate and provide images to train and test the algorithm further. “We now have 2.4 million images that have been enriched, and we will use these in the near future as we move beyond the proof-of-concept stage.”

He is looking for as many partners as possible to help collect enriched images and develop this into a real tool for use in hospitals worldwide on clinical patients. “We have joined forces with several hospitals but continue looking for further collaborations.

“We need, just like self-driving cars, not just thousands, but tens of thousands or millions of images to develop this.”

Dr. Jans declared receiving speaker fees from UCB, AbbVie, Lilly, and Novartis, and that he is cofounder of a future spin-off of Ghent University RheumaFinder. Dr. Poddubnyy and Dr. Boesen declared no relevant disclosures.

Article Source

FROM THE 2022 SPA CONGRESS


Should patients with PsA or ankylosing spondylitis with axial disease be ‘lumped’ or ‘split’?

Article Type
Changed
Tue, 02/07/2023 - 16:39

 

A new study provides evidence that two conditions that fall under the umbrella of spondyloarthritis – isolated axial disease in patients with psoriatic arthritis (PsA) and isolated axial disease in patients with ankylosing spondylitis (AS) accompanied by psoriasis – are different clinical entities and may need different treatments. These relatively rare rheumatologic conditions, defined by their back involvement, have considerable clinical overlap and are often lumped together under the label axial spondyloarthritis.

“This is a hot topic and a current matter of debate within the scientific community: Are axial PsA and axial AS two separate diseases or just two phenotypes under the spondyloarthritis umbrella?” said Fabian Proft, MD, a rheumatologist and researcher at Charité Universitätsmedizin Berlin, commenting on the new study, which was published online in Annals of the Rheumatic Diseases.

Dr. Fabian Proft

Both conditions belong to the spectrum of spondyloarthritis, but with varying viewpoints on nomenclature. They have intersections and overlaps, but not all treatments are equally effective for both. “We need to better understand their differences and similarities,” Dr. Proft said, adding that the new study is noteworthy for the size of the population included, its long-term follow-up data, and the researchers’ depth of experience treating these patients.

The researchers are based at the University of Toronto, which has separate clinics dedicated to PsA and to AS, said Dafna D. Gladman, MD, professor of medicine at the university, codirector of the PsA clinic, and corresponding author for the new study. The two clinics follow the same standardized protocols, including clinical, radiographic, genetic, and laboratory assessments. Even though the patients present quite similarly, she credits referring physicians for recognizing the distinctions by their referrals to the PsA or AS clinic.

According to previous research, pure axial PsA, without peripheral involvement, is rare, affecting about 2%-5% of patients with PsA. For this study, an observational cohort of 1,576 patients from the PsA clinic included 31% (n = 495) with axial disease, 2% (n = 32) with isolated axial PsA, and 29% (n = 463) with both axial and peripheral involvement. A total of 25 of the patients with isolated axial PsA ultimately developed peripheral disease by their most recent clinic follow-up visit. In a second cohort of 1,688 patients with AS, nearly 5% (n = 68) had isolated axial disease with psoriasis.
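The subgroup counts reported above are internally consistent, which a quick arithmetic check confirms (figures taken directly from the article; the helper is just for illustration):

```python
# Cohort counts reported in the study
psa_total = 1576
axial_any = 495            # any axial disease
axial_isolated = 32        # isolated axial PsA
axial_plus_peripheral = 463  # axial plus peripheral involvement

# Isolated and mixed axial cases together make up the axial group
assert axial_isolated + axial_plus_peripheral == axial_any

def pct(n, total):
    """Whole-number percentage, as the article rounds its figures."""
    return round(100 * n / total)

print(pct(axial_any, psa_total))              # 31%
print(pct(axial_isolated, psa_total))         # 2%
print(pct(axial_plus_peripheral, psa_total))  # 29%
```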

Dr. Dafna D. Gladman

“In our logistic regression analysis, isolated axial PsA was found to be a different clinical entity than isolated AS with psoriasis. They are not the same patients,” Dr. Gladman said. The patients with isolated axial PsA were older at diagnosis, more likely to have psoriatic nail lesions, and less likely to have inflammatory back pain than were patients with isolated axial AS and accompanying psoriasis.

When interviewed in early September, Dr. Gladman was preparing to fly to Ghent, Belgium, to participate in a debate at the International Congress on Spondyloarthritides, taking the pro position on the thesis: Is axial inflammation in PsA distinct from axial spondyloarthritis? Taking the con position was to be Robert Landewé, MD, PhD, of Amsterdam University Medical Center in the Netherlands.

“This is an old debate, splitters versus lumpers,” Dr. Gladman told this news organization. “My message is that when you place patients in more homogeneous groups, you can learn more and perhaps find better opportunities for treating their disease.” For example, even with the similarities, do these patients need to be treated with different medications? Medications for psoriasis, including those targeting the interleukin-23 cytokine, may not be effective for AS, but patients with axial PsA may not get them because of the association with axial AS.



“Now is the opportunity to really understand what – if any – are the differences between various components of this disease group. If you lump people together, you may miss the forest for the trees,” Dr. Gladman said. “If, at the end of the day, we find out these patients essentially are the same, I will lump. But until we have proved that there are no important differences, I will split.” She added that it is important for practicing rheumatologists to make the correct diagnosis so that they know to access certain drugs.

Dr. Proft credited Dr. Gladman and colleagues’ study for adding another piece of the puzzle to better understand differences and similarities for these two axial diseases. He noted, however, that the study did not include MRI scans for every participating patient, which could have given a deeper picture.

“International efforts are being made to recruit patients for a multinational, multicenter study of axial involvement in PsA,” which will include MRI data, Dr. Gladman said. She and Dr. Proft are both part of AXIS, the Axial Involvement in Psoriatic Arthritis cohort, now recruiting patients for such a study. AXIS is a joint project of the Assessment of SpondyloArthritis international Society and the Group for Research and Assessment of Psoriasis and Psoriatic Arthritis.

“We don’t have final answers yet, although we have given evidence to support the differences.” The proof is in the pudding, she said, and that pudding will be the clinical trials.

The University of Toronto Psoriatic Arthritis Program is supported by a grant from the Krembil Foundation. The study authors declared no competing interests. Dr. Proft reported receiving research support from Novartis, Eli Lilly, and UCB, and fees for consulting and serving on speakers bureaus from AbbVie, Amgen, Bristol-Myers Squibb, Celgene, Eli Lilly, Hexal, Janssen, Merck Sharp & Dohme, Novartis, Pfizer, Roche, and UCB.


 

A new study provides evidence that two conditions that fall under the umbrella of spondyloarthritis – isolated axial disease in patients with psoriatic arthritis (PsA) and isolated axial disease in patients with ankylosing spondylitis (AS) accompanied by psoriasis – are different clinical entities and may need different treatments. These relatively rare rheumatologic conditions, defined by their back involvement, have considerable clinical overlap and are often lumped together under the label axial spondyloarthritis.

This is a hot topic and current matter of debate within the scientific community: Are axial PsA and axial AS two separate diseases or just two phenotypes under the spondyloarthritis umbrella? said Fabian Proft, MD, a rheumatologist and researcher at Charité Universitätsmedizin Berlin, commenting on the new study, which was published online in Annals of the Rheumatic Diseases.

Dr. Fabian Proft

Both conditions belong to the spectrum of spondyloarthritis, but with varying viewpoints on nomenclature. They have intersections and overlaps, but not all treatments are equally effective for both. “We need to better understand their differences and similarities,” Dr. Proft said, adding that the new study is noteworthy for the size of the population included, its long-term follow-up data, and the researchers’ depth of experience treating these patients.

The researchers are based at the University of Toronto, which has separate clinics dedicated to PsA and to AS, said Dafna D. Gladman, MD, professor of medicine at the university, codirector of the PsA clinic, and corresponding author for the new study. The two clinics follow the same standardized protocols, including clinical, radiographic, genetic, and laboratory assessments. Even though the patients present quite similarly, she credits referring physicians for recognizing the distinctions by their referrals to the PsA or AS clinic.

According to previous research, pure axial PsA, without peripheral involvement, is rare, affecting about 2%-5% of patients with PsA. For this study, an observational cohort of 1,576 patients from the PsA clinic included 31% (n = 495) with axial disease, 2% (n = 32) with isolated axial PsA, and 29% (n = 463) with both axial and peripheral involvement. A total of 25 of the patients with isolated axial PsA ultimately developed peripheral disease by their most recent clinic follow-up visit. In a second cohort of 1,688 patients with AS, nearly 5% (n = 68) had isolated axial disease with psoriasis.

Dr. Dafna D. Gladman

“In our logistic regression analysis, isolated axial PsA was found to be a different clinical entity than isolated AS with psoriasis. They are not the same patients,” Dr. Gladman said. The patients with isolated axial PsA were older at diagnosis, more likely to have psoriatic nail lesions, and less likely to have inflammatory back pain than were patients with isolated axial AS and accompanying psoriasis.

When interviewed in early September, Dr. Gladman was preparing to fly to Ghent, Belgium, to participate in a debate at the International Congress on Spondyloarthritides, taking the pro position on the thesis: Is axial inflammation in PsA distinct from axial spondyloarthritis? Taking the con position was to be Robert Landewé, MD, PhD, of Amsterdam University Medical Center in the Netherlands.

“This is an old debate, splitters versus lumpers,” Dr. Gladman told this news organization. “My message is that when you place patients in more homogeneous groups, you can learn more and perhaps find better opportunities for treating their disease.” For example, even with the similarities, do these patients need to be treated with different medications? Medications for psoriasis, including those targeting the interleukin-23 cytokine, may not be effective for AS, but patients with axial PsA may not get them because of the association with axial AS.



“Now is the opportunity to really understand what – if any – are the differences between various components of this disease group. If you lump people together, you may miss the forest for the trees,” Dr. Gladman said. “If, at the end of the day, we find out these patients essentially are the same, I will lump. But until we have proved that there are no important differences, I will split.” She added that it is important for practicing rheumatologists to make the correct diagnosis so that they know to access certain drugs.

Dr. Proft credited Dr. Gladman and colleagues’ study for adding another piece of the puzzle to better understand differences and similarities for these two axial diseases. He noted, however, that the study did not include MRI scans for every participating patient, which could have given a deeper picture.

“International efforts are being made to recruit patients for a multinational, multicenter study of axial involvement in PsA,” which will include MRI data, Dr. Gladman said. She and Dr. Proft are both part of AXIS, the Axial Involvement in Psoriatic Arthritis cohort, now recruiting patients for such a study. AXIS is a joint project of the Assessment of SpondyloArthritis international Society and the Group for Research and Assessment of Psoriasis and Psoriatic Arthritis.

“We don’t have final answers yet, although we have given evidence to support the differences.” The proof is in the pudding, she said, and that pudding will be the clinical trials.

The University of Toronto Psoriatic Arthritis Program is supported by a grant from the Krembil Foundation. The study authors declared no competing interests. Dr. Proft reported receiving research support from Novartis, Eli Lilly, and UCB, and fees for consulting and serving on speakers bureaus from AbbVie, Amgen, Bristol-Myers Squibb, Celgene, Eli Lilly, Hexal, Janssen, Merck Sharp & Dohme, Novartis, Pfizer, Roche, and UCB.

 

A new study provides evidence that two conditions that fall under the umbrella of spondyloarthritis – isolated axial disease in patients with psoriatic arthritis (PsA) and isolated axial disease in patients with ankylosing spondylitis (AS) accompanied by psoriasis – are different clinical entities and may need different treatments. These relatively rare rheumatologic conditions, defined by their back involvement, have considerable clinical overlap and are often lumped together under the label axial spondyloarthritis.

“This is a hot topic and current matter of debate within the scientific community: Are axial PsA and axial AS two separate diseases or just two phenotypes under the spondyloarthritis umbrella?” said Fabian Proft, MD, a rheumatologist and researcher at Charité Universitätsmedizin Berlin, commenting on the new study, which was published online in Annals of the Rheumatic Diseases.


Both conditions belong to the spectrum of spondyloarthritis, but with varying viewpoints on nomenclature. They have intersections and overlaps, but not all treatments are equally effective for both. “We need to better understand their differences and similarities,” Dr. Proft said, adding that the new study is noteworthy for the size of the population included, its long-term follow-up data, and the researchers’ depth of experience treating these patients.

The researchers are based at the University of Toronto, which has separate clinics dedicated to PsA and to AS, said Dafna D. Gladman, MD, professor of medicine at the university, codirector of the PsA clinic, and corresponding author for the new study. The two clinics follow the same standardized protocols, including clinical, radiographic, genetic, and laboratory assessments. Even though the patients present quite similarly, she credits referring physicians for recognizing the distinctions by their referrals to the PsA or AS clinic.

According to previous research, pure axial PsA, without peripheral involvement, is rare, affecting about 2%-5% of patients with PsA. For this study, an observational cohort of 1,576 patients from the PsA clinic included 31% (n = 495) with axial disease, 2% (n = 32) with isolated axial PsA, and 29% (n = 463) with both axial and peripheral involvement. A total of 25 of the patients with isolated axial PsA ultimately developed peripheral disease by their most recent clinic follow-up visit. In a second cohort of 1,688 patients with AS, nearly 5% (n = 68) had isolated axial disease with psoriasis.
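The cohort breakdown above is internally consistent, which can be checked directly; here is a minimal sketch (counts taken from the text, variable names ours, not study code):

```python
# Sanity check of the PsA cohort breakdown reported above.
# Counts come from the article text; this is an illustrative check only.
cohort = 1576
axial_total, isolated_axial, axial_plus_peripheral = 495, 32, 463

# The two axial subgroups should sum to the total with axial disease.
assert isolated_axial + axial_plus_peripheral == axial_total

for label, n in [("axial disease", axial_total),
                 ("isolated axial PsA", isolated_axial),
                 ("axial + peripheral", axial_plus_peripheral)]:
    print(f"{label}: {n}/{cohort} = {100 * n / cohort:.1f}%")
```

The printed shares (31.4%, 2.0%, 29.4%) match the rounded percentages reported in the text.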


“In our logistic regression analysis, isolated axial PsA was found to be a different clinical entity than isolated AS with psoriasis. They are not the same patients,” Dr. Gladman said. The patients with isolated axial PsA were older at diagnosis, more likely to have psoriatic nail lesions, and less likely to have inflammatory back pain than were patients with isolated axial AS and accompanying psoriasis.

When interviewed in early September, Dr. Gladman was preparing to fly to Ghent, Belgium, to participate in a debate at the International Congress on Spondyloarthritides, taking the pro position on the thesis: Is axial inflammation in PsA distinct from axial spondyloarthritis? Taking the con position was to be Robert Landewé, MD, PhD, of Amsterdam University Medical Center in the Netherlands.

“This is an old debate, splitters versus lumpers,” Dr. Gladman told this news organization. “My message is that when you place patients in more homogeneous groups, you can learn more and perhaps find better opportunities for treating their disease.” For example, even with the similarities, do these patients need to be treated with different medications? Medications for psoriasis, including those targeting the interleukin-23 cytokine, may not be effective for AS, but patients with axial PsA may not get them because of the association with axial AS.



“Now is the opportunity to really understand what – if any – are the differences between various components of this disease group. If you lump people together, you may miss the forest for the trees,” Dr. Gladman said. “If, at the end of the day, we find out these patients essentially are the same, I will lump. But until we have proved that there are no important differences, I will split.” She added that it is important for practicing rheumatologists to make the correct diagnosis so that they know to access certain drugs.

Dr. Proft credited Dr. Gladman and colleagues’ study for adding another piece of the puzzle to better understand differences and similarities for these two axial diseases. He noted, however, that the study did not include MRI scans for every participating patient, which could have given a deeper picture.

“International efforts are being made to recruit patients for a multinational, multicenter study of axial involvement in PsA,” which will include MRI data, Dr. Gladman said. She and Dr. Proft are both part of AXIS, the Axial Involvement in Psoriatic Arthritis cohort, now recruiting patients for such a study. AXIS is a joint project of the Assessment of SpondyloArthritis international Society and the Group for Research and Assessment of Psoriasis and Psoriatic Arthritis.

“We don’t have final answers yet, although we have given evidence to support the differences.” The proof is in the pudding, she said, and that pudding will be the clinical trials.

The University of Toronto Psoriatic Arthritis Program is supported by a grant from the Krembil Foundation. The study authors declared no competing interests. Dr. Proft reported receiving research support from Novartis, Eli Lilly, and UCB, and fees for consulting and serving on speakers bureaus from AbbVie, Amgen, Bristol-Myers Squibb, Celgene, Eli Lilly, Hexal, Janssen, Merck Sharp & Dohme, Novartis, Pfizer, Roche, and UCB.

Article Source

FROM ANNALS OF THE RHEUMATIC DISEASES


One fish, two fish, are good fish for you ... fish

Article Type
Changed
Thu, 09/08/2022 - 14:51

 

Good news for pregnant women; bad news for fish

As soon as women find out they’re pregnant, doctors recommend they give up smoking, drinking, and eating certain types of fish. That last item may need to be reconsidered, since a recent study supports the idea that it doesn’t matter what type of fish pregnant women are eating, as long as they’re eating it.


Researchers collected data from two different studies that reviewed the mercury levels of mothers from Bristol, England, and the Seychelles, an island chain off East Africa where “fish consumption is high and prenatal mercury levels are 10 times higher than in the [United States],” they said in NeuroToxicology.

Those data showed that the mercury levels had no adverse effects on child development as long as the mother ate fish. The nutrients and vitamins in the fish – vitamin D, long-chain fatty acids, selenium, and iodine – provide protection against mercury. There are also the already-known benefits to eyesight and intellectual abilities that have been associated with fish consumption.

This analysis goes starkly against the grain of what is commonly recommended to expectant mothers, which is to cut out fish altogether. The researchers suggested that governments should review and change those recommendations to focus on the benefits instead.

As long as women follow the researchers’ recommendation to eat “at least two portions of fish a week, one of which should be oily,” they may not have to lay off the sushi after all.
 

We’ll show our gut worms the world

Never let it be said that mankind is not a generous species. Sure, we could maybe be kinder to our fellow human beings, maybe declare a little less war on each other, but for the past 50,000 years, we’ve been giving a free ride to millions upon millions of one of mankind’s closest companions: the whipworm.


This revelation into human kindness comes from Denmark, where researchers from Copenhagen conducted a genetic analysis of ancient preserved whipworm eggs found in old Viking and Norse settlements, some of which date back over 2,000 years. Under normal conditions, genetic material wouldn’t last very long, but these were Viking whipworm eggs with tiny little horned helmets, so the DNA within has remained unchanged. Or it may be the tough chitinous exterior of the eggs protecting the DNA from degrading, combined with their preservation in moist soil.

Once they had their Viking whipworm DNA, the researchers compared it with whipworm DNA from all over the world, tracing its history as it followed mankind from Africa. And it’s been a while: We brought whipworms with us during our initial migration into Asia and Europe over 50,000 years ago. When the Bering land bridge opened up and humanity moved into the Americas, the worms came as well.

This is all possible because the whipworm goes about its parasitic business quietly and cleverly. It mostly sits harmlessly in our digestive systems, producing thousands of eggs a day that get expelled through poop and picked up by another host (human or otherwise); whipworms only cause disease in those with compromised immune systems.

The researchers noted that their study, the first complete genetic analysis of the whipworm, could help combat the parasite, which to this day infects hundreds of millions of people who don’t have access to modern medicine or sanitary conditions. Hopefully, though, the days of free rides will soon be over for the whipworm. After all, if we have to pay hundreds or thousands of dollars to visit other countries, it’s only fair that our parasites do as well.
 

 

 

From zero to vasectomy in 6.7 seconds

There’s an old saying that you’ve probably heard: When life gives you lemons, make lemonade. It’s meant to encourage optimism in the face of adversity. Then there’s the new saying we just made up: When life gives you a power outage, plug your surgical instruments into an electric pickup.


That’s what Dr. Christopher Yang did, and now we’re making the urologist from Austin, Tex., famous by sharing his surgical/electrical adventure with all 17 of LOTME’s regular readers. That’s some serious lemonade.

Dr. Yang’s tale begins when the electricity went out at his clinic, seemingly forcing him to cancel or reschedule several surgical procedures. Not so fast. Dr. Yang happens to own a Rivian R1T, an electric pickup truck that has four power outlets. A staff member suggested plugging the surgical instruments into the truck and, surprisingly, one of the day’s patients agreed to go ahead with his vasectomy.

“We were fortunate that my normal parking spot is close enough to a patient room to run an extension cord,” Dr. Yang said on TheDrive.com. That extension cord was attached to an electrocautery device, with a handheld device available as backup, and “after we were done, I told his family. We all had a good laugh together too,” Dr. Yang told radio station WGLT in Normal, Ill.

To us, anyway, this opens up all sorts of alternative energy possibilities. Can a windmill power a liposuction? Is a gerbil running in a wheel enough to do a colonoscopy? How many potatoes do you need to keep an EHR going?
 

Learning through random acts of not-exactly noisiness

First things first. Transcranial random noise stimulation (tRNS) is not really noise in the auditory sense of the word. For some people with learning disabilities, though, it can actually be very helpful. The technology, which uses electrodes attached to the head so a weak current can pass through specific parts of the brain, may help those with learning disabilities, perhaps even those with brain injuries and visual deficits, learn, said Dr. Onno van der Groen of Edith Cowan University in Perth, Australia.


“When you add this type of stimulation during learning, you get better performance, faster learning and better attention afterwards as well,” he said in a statement from the university.

The researchers say that tRNS can allow the brain to form new connections and pathways, which in turn help a person learn more effectively. “If you do 10 sessions of a visual perception task with the tRNS and then come back and do it again without it, you’ll find you perform better than the control group who hasn’t used it,” Dr. van der Groen noted.

Can this also work for the average person? It’s possible, but tRNS didn’t seem to improve the math skills of a top-level mathematician who underwent the process, according to a case study that Dr. van der Groen mentioned.

This line of work is still pretty new, though, so researchers don’t have all the answers yet. As always, we’re rooting for you, science!


Commentary: Better Migraine Outcomes Measures, September 2022

Article Type
Changed
Thu, 09/08/2022 - 15:01
Dr Berk scans the journal, so you don't have to!

 

The theme of this month's commentary is alternative outcomes measures for future migraine studies. The traditional outcomes measures, such as headache frequency measured in headache days, have long been considered gold standards when evaluating the efficacy of preventive interventions. When headache conditions are complicated by interictal pain or other symptoms, or when medication overuse adds a higher frequency or greater severity, those traditional measures are somewhat less exact and specific. Meaningful change for patients with higher frequency of attacks, near-continuous pain, or other migraine symptoms is quite different from that for those without these complications.

 

Ailani and colleagues reviewed post hoc data from the CONQUER trial, a prior study evaluating the safety and efficacy of galcanezumab vs placebo in patients who had previously not benefited from two to four categories of migraine preventive medication. This refractory population was initially noted to have 4.1 fewer headache days per month than patients taking placebo, but the authors now attempted to review these data with a focus on a different measure: total pain burden (TPB). They defined daily TPB as a single composite measure assessing the frequency, duration, and severity of migraine, calculated by multiplying the number of hours of migraine by the maximum daily migraine pain severity score. The monthly TPB was calculated by adding the daily pain burden over the entire month. The Migraine Disability Assessment questionnaire (MIDAS) and Migraine-Specific Quality of Life Questionnaire (MSQ) scores were also included to compare migraine-related disability and quality of life.
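To make the TPB composite concrete, here is a minimal sketch of the arithmetic described above; the daily records and the 0-3 severity scale below are hypothetical, chosen only to illustrate the calculation, and are not trial data:

```python
# Illustrative sketch of total pain burden (TPB) as defined above:
#   daily TPB   = hours of migraine x maximum daily pain severity score
#   monthly TPB = sum of daily TPB over the month
# The example records are hypothetical, not trial data.

def daily_tpb(hours_of_migraine: float, max_severity: int) -> float:
    return hours_of_migraine * max_severity

def monthly_tpb(daily_records) -> float:
    return sum(daily_tpb(hours, severity) for hours, severity in daily_records)

# Example month with three migraine days: (hours, max severity on a 0-3 scale).
month = [(8, 2), (4, 3), (12, 1)]
print(monthly_tpb(month))  # 8*2 + 4*3 + 12*1 = 40
```

A month with fewer but longer or more severe attacks can thus carry a higher TPB than one with more headache days, which is exactly the burden that a plain headache-day count misses.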

 

The patients who received galcanezumab were noted to have a significantly lower TPB, both in episodic and chronic migraine. Significantly greater reductions in monthly TPB relative to placebo were observed at each individual month as well. The change from baseline TPB was also noted to be significantly improved in the galcanezumab group compared with the placebo group. The reduction in TPB was noted even when migraine-day reductions were accounted for as part of a sensitivity analysis.

 

Preventive trials for migraine treatment focus primarily on migraine-day reduction, and for many patients with higher-frequency migraine, this measure does not adequately account for their disease-related disability. This unique way of looking at pain as part of a bigger picture is much more significant and meaningful for this patient population. Migraine frequency is still a very important outcomes measure, but it would be wise to add TPB or another measure that looks more globally at disease-related disability, especially when investigating preventive options in patients with chronic migraine.

 

When considering whether an intervention is helpful, most patients and clinicians follow the headache frequency, severity, or quality-of-life factors. As most patients will readily report, not all "headache-free days" are created equal. Although most people with migraine will experience days with absolutely no headache pain or other migraine-associated symptoms, on many days they will still have some symptoms of migraine. Lee and colleagues attempted to quantify the difference between headache-free days and crystal-clear days.

 


Author and Disclosure Information

Thomas Berk, MD 

Neura Health and Thomas Jefferson University, Woodbury, NJ 

Dr Berk scans the journal, so you don't have to!

 

The theme of this month's commentary is alternative outcomes measures for future migraine studies. The traditional outcomes measures, such as headache frequency measured in headache days, have long been considered gold standards when evaluating the efficacy of preventive interventions. When headache conditions are complicated by interictal pain or other symptoms, or when medication overuse adds a higher frequency or greater severity, those traditional measures are somewhat less exact and specific. Meaningful change for patients with higher frequency of attacks, near-continuous pain, or other migraine symptoms is quite different from that for those without these complications.

 

Ailani and colleagues reviewed post hoc data from the CONQUER trial, a prior study evaluating the safety and efficacy of galcanezumab vs placebo in patients who had previously not benefited from two to four categories of migraine preventive medication. In this refractory population, galcanezumab was initially noted to yield 4.1 fewer headache days per month than placebo, but the authors now attempted to review these data with a focus on a different measure: total pain burden (TPB). They defined daily TPB as a single composite measure assessing the frequency, duration, and severity of migraine, calculated by multiplying the number of hours of migraine by the maximum daily migraine pain severity score. The monthly TPB was calculated by adding the daily pain burden over the entire month. The Migraine Disability Assessment questionnaire (MIDAS) and Migraine-Specific Quality of Life Questionnaire (MSQ) scores were also included to compare migraine-related disability and quality of life.
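As a rough illustration (not the trial's actual analysis code), the TPB arithmetic described above can be sketched as follows. The severity scale and the example values are assumptions for demonstration only:

```python
# Hypothetical sketch of the total pain burden (TPB) composite measure:
# daily TPB = hours of migraine x maximum daily pain severity score,
# and monthly TPB = sum of daily TPB values over the month.
# The 0-3 severity scale and example data below are assumptions.

def daily_tpb(hours_of_migraine: float, max_severity: int) -> float:
    """Daily total pain burden: duration (hours) times peak severity."""
    return hours_of_migraine * max_severity

def monthly_tpb(daily_records) -> float:
    """Sum daily (hours, max severity) pairs over a month."""
    return sum(daily_tpb(hours, severity) for hours, severity in daily_records)

# Example month: three migraine days of varying duration and severity.
month = [(4.0, 2), (8.0, 3), (2.0, 1)]  # (hours, max severity on 0-3 scale)
print(monthly_tpb(month))  # 4*2 + 8*3 + 2*1 = 34.0
```

The composite rewards improvement in duration or severity even when the count of migraine days is unchanged, which is why it can detect benefit that a days-only endpoint misses.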

 

The patients who received galcanezumab were noted to have a significantly lower TPB in both episodic and chronic migraine. Significantly greater reductions in monthly TPB relative to placebo were observed at each individual month as well. The change from baseline TPB was also noted to be significantly improved in the galcanezumab group compared with the placebo group. The reduction in TPB was noted even when migraine-day reductions were accounted for as part of a sensitivity analysis.

 

Preventive trials for migraine treatment focus primarily on migraine-day reduction, and for many patients with higher-frequency migraine, this measure does not adequately account for their disease-related disability. This way of looking at pain as part of a bigger picture is much more meaningful for this patient population. Migraine frequency is still a very important outcomes measure, but it would be wise to add TPB or another measure that looks more globally at disease-related disability, especially when investigating preventive options in patients with chronic migraine.

 

When considering whether an intervention is helpful, most patients and clinicians follow the headache frequency, severity, or quality-of-life factors. As most patients will readily report, not all "headache-free days" are created equal. Although most people with migraine will experience days with absolutely no headache pain or other migraine-associated symptoms, on many days they will still have some symptoms of migraine. Lee and colleagues attempted to quantify the difference between headache-free days and crystal-clear days.

 

Most headache studies use the frequency of headache days as a primary or secondary outcome. This study collected data on both headache days and crystal-clear days, using data from a large questionnaire-based nationwide population study in South Korea that evaluated headache and sleep. The study questions were validated for migraine and aura, and included: "How many days have you had a headache during the previous 30 days?" and "How many days have you had crystal-clear days without headache during the previous 30 days?" The data were then analyzed and compared with the widespread pain index (criteria for fibromyalgia) as well as sleep duration, sleep quality, depression and anxiety scales, and an allodynia checklist.

 

A little over 3000 respondents completed the surveys; 1938 had experienced headache over the past year, 170 were classified as having a diagnosis of migraine, and 50 of those were diagnosed with aura as well. Out of the patients with migraine, 97% had "unclear days." This was higher than the rate of those with non-migraine headaches (91%). Nearly all people surveyed had some crystal-clear days (99.4%).

 

The number of crystal-clear days per 30 days was significantly lower in participants with migraine than in those with non-migraine headache. Participants with migraine also had higher frequencies of cutaneous allodynia, anxiety, and depression. The weekly average sleep duration in participants with migraine did not significantly differ from that in participants with non-migraine headaches. The widespread pain index rate was much higher in those with migraine as well.

 

Most patients readily understand the difference between crystal-clear and unclear headache days. Many of the newer outcomes studies in migraine have started focusing on the most bothersome symptom, as headache pain is far from the only significant or disabling symptom associated with migraine. This study suggests that a more meaningful outcome in migraine studies may actually be crystal-clear days rather than simply headache-free days.

 

Although there are more acute options available for headache treatment, medication overuse headache remains a major complicating factor for most clinicians who treat headache. When educating patients, there is always a strong emphasis on guidelines for acute medication use. Many patients struggle with knowing when to use an acute treatment and when to alternate with a different treatment, and often they will withhold treatment completely due to fear of medication overuse. The new class of calcitonin gene-related peptide (CGRP) antagonist medications has shown some potential benefit as a preventive option for both medication overuse headache and migraine.

 

The prospective study by Curone and colleagues enrolled 300 patients with confirmed medication overuse headache who did not undergo withdrawal of the overused acute medication. Patients who were already taking preventive medications were excluded, as were patients with diagnoses other than chronic migraine or medication overuse headache. Patients were given one of three injectable CGRP antagonist medications for prevention and were followed up at 3, 6, 9, and 12 months. The primary outcomes were MIDAS score, monthly headache days, and analgesic consumption.

 

Out of 303 patients, 242 (80%) showed both a ≥50% reduction in monthly headache days and a ≥50% reduction in analgesic intake at the 3-month follow-up visit. At 9 months, 198 (65%) were still responders. Monthly analgesic intake decreased ≥50% in 268 of 303 patients (88%) at 3 months and in 241 of 303 patients (79%) at the 6-month follow-up.
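The ≥50% responder definition used in these results can be illustrated with a short sketch; the patient values and field names below are invented for illustration and are not the study's data:

```python
# Hypothetical sketch of the >=50% responder criterion: a patient
# counts as a responder if both monthly headache days AND monthly
# analgesic intake fall to half or less of the baseline values.
# All patient data below are invented examples.

def is_responder(baseline_days: int, followup_days: int,
                 baseline_intake: int, followup_intake: int) -> bool:
    """True if both headache days and analgesic intake dropped >=50%."""
    return (followup_days <= 0.5 * baseline_days and
            followup_intake <= 0.5 * baseline_intake)

patients = [
    {"base_days": 20, "fu_days": 8,  "base_intake": 25, "fu_intake": 10},
    {"base_days": 18, "fu_days": 12, "base_intake": 20, "fu_intake": 9},
]
responders = [p for p in patients
              if is_responder(p["base_days"], p["fu_days"],
                              p["base_intake"], p["fu_intake"])]
print(len(responders))  # 1: the second patient's headache days fell by <50%
```

Requiring both reductions simultaneously, as in this sketch, is a stricter bar than either endpoint alone, which is worth keeping in mind when comparing the 80% combined responder rate with the 88% analgesic-intake figure.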

 

For years there has been a debate regarding whether withdrawal of an overused medication is necessary for effective treatment of medication overuse headache. Many preventive treatments are less effective when medication overuse is ongoing. The CGRP class of medications does appear to be effective even with ongoing acute medication overuse. This class of medications should definitely be considered when withdrawing an overused medication is complicated, or when a patient needs to continue to take analgesic medications for another condition.

 


Article Series
Clinical Edge Journal Scan Commentary: Migraine September 2022

Where a child eats breakfast is important

Article Type
Changed
Wed, 09/07/2022 - 17:21

We’ve been told for decades that a child who doesn’t start the day with a good breakfast is entering school at a serious disadvantage. The brain needs a good supply of energy to learn optimally. So the standard wisdom goes. Subsidized school breakfast programs have been built around this chestnut. But, is there solid evidence to support the notion that simply adding a morning meal to a child’s schedule will improve his or her school performance? It sounds like common sense, but is it just one of those old grandmother’s nuggets that doesn’t stand up under close scrutiny?

A recent study from Spain suggests that the relationship between breakfast and school performance is not merely related to the nutritional needs of a growing brain. Using data from nearly 4,000 Spanish children aged 4-14 collected in a 2017 national health survey, the investigators found “skipping breakfast and eating breakfast out of the home were linked to greater odds of psychosocial behavioral problems than eating breakfast at home.” And, we already know that, in general, children who misbehave in school don’t thrive academically.

Dr. William G. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years.
Dr. William G. Wilkoff

There were also associations between the absence or presence of certain food groups in the morning meal with behavioral problems. But the data lacked the granularity to draw any firm conclusions – although the authors felt that what they consider a healthy Spanish diet may have had a positive influence on behavior.

The findings in this study may simply be another example of the many positive influences that have been associated with family meals and have little to do with what is actually consumed. The association may not have much to do with the family gathering together at a single Norman Rockwell sitting, a reality that I suspect seldom occurs. The apparent positive influence of breakfast may be that it reflects a family’s priorities: that food is important, that sleep is important, and that school is important – so important that scheduling the morning should focus on sending the child off well prepared. The child who is allowed to stay up to an unhealthy hour is likely to be difficult to arouse in the morning for breakfast and getting off to school.

It may be that the child’s behavior problems are so disruptive and taxing for the family that even with their best efforts, the parents can’t find the time and energy to provide a breakfast in the home.

On the other hand, the study doesn’t tell us how many children aren’t offered breakfast at home because their families simply can’t afford it. Obviously, the answer depends on the socioeconomic mix of a given community. In some localities this may represent a sizable percentage of the population.

So where does this leave us? Unfortunately, as I read through the discussion at the end of this paper I felt that the authors were leaning too much toward further research based on the potential associations between behavior and specific food groups that their data suggested.

For me, the take-home message from this paper is that our existing efforts to improve academic success with food offered in school should also include strategies that promote eating breakfast at home. For example, the backpack take-home food distribution programs that seem to have been effective could include breakfast-targeted items packaged in a way that encourages families to provide breakfast at home.

Dr. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years. He has authored several books on behavioral pediatrics, including “How to Say No to Your Toddler.” Other than a Littman stethoscope he accepted as a first-year medical student in 1966, Dr. Wilkoff reports having nothing to disclose. Email him at [email protected].


Pediatricians urge flu vaccine for children

Article Type
Changed
Wed, 09/07/2022 - 16:15

Attention parents: The nation’s leading pediatric medical society is urging you to make sure your children get a flu shot this fall to prevent and control the spread of the illness.

The American Academy of Pediatrics recently called on parents and caregivers to seek flu vaccines for their children as soon as they are available in the fall. The group is encouraging parents to catch up on all other vaccines for their children, too.

“As a pediatrician and a parent, I consider the flu vaccine as critical for all family members,” Kristina A. Bryant, MD, said in a statement about the academy’s recommendations. “We should not underestimate the flu, especially when other respiratory viruses like COVID-19 are circulating within our communities. Besides making your child miserable and wreaking havoc on your family’s routine, influenza can also be serious and even deadly in children.”

Only 55% of children aged 6 months to 17 years had been vaccinated against influenza as of early April – down 2% from the previous April – and coverage levels were 8.1% lower for Black children compared with non-Hispanic White children, according to the CDC. In the 2019-2020 flu season, 188 children in the United States died of the infection, equaling the high mark for deaths set in the 2017-2018 season, the agency reported.

American Academy of Pediatrics guidelines recommend children aged 6 months and older be vaccinated with the flu vaccine every year. Depending on the child’s age and health, they may receive either a shot, which has an inactive version of the flu virus, or the nasal spray, which has a weakened form of the virus. The academy has more information about the different vaccines.

Children aged 6 months through 8 years who are getting a flu vaccine for the first time should receive two doses at least 4 weeks apart. Pregnant women can get the flu vaccine at any time during pregnancy. Influenza vaccines are safe for developing fetuses, according to the academy.

The group stressed the importance of flu vaccines for high-risk and medically vulnerable children and acknowledged the need to end barriers to immunizations for all people, regardless of income or insurance coverage. In 2020, an estimated 16.1% of children in the United States were living in poverty, up from 14.4% in 2019, according to the U.S. Census Bureau.

A version of this article first appeared on WebMD.com.


Punked By the Punctum: Domestically Acquired Cutaneous Myiasis

Article Type
Changed
Wed, 09/14/2022 - 15:00

To the Editor:

Cutaneous myiasis is a skin infestation with dipterous larvae that feed on the host’s tissue and cause a wide range of manifestations depending on the site of infestation; comprising furuncular, wound, and migratory types, it is the most common clinical form of myiasis.1 Myiasis is endemic to tropical and subtropical areas and is uncommon in the United States; thus, it can pose a diagnostic challenge when it presents in nonendemic areas. We present the case of a woman from Michigan who acquired furuncular myiasis without a travel history to a tropical or subtropical locale.

A 72-year-old woman presented to our clinic with a chief concern of a burning, pruritic, migratory skin lesion on the left arm of approximately 1 week’s duration. She had a medical history of squamous cell carcinoma, keratoacanthoma, and multiple tick bites. She reported that the lesion started on the distal aspect of the left arm as an eraser-sized, perfectly round, raised bruise with a dark pepperlike bump in the center. The lesion then spread proximally over the course of 1 week, creating 3 more identical lesions. As one lesion resolved, a new lesion appeared approximately 2 to 4 cm proximal to the preceding lesion. The patient had traveled to England, Scotland, and Ireland 2 months prior but otherwise denied leaving the state of Michigan. She reported frequent exposure to gardens, meadows, and wetlands in search of milkweed and monarch butterfly larvae that she raises in northeast Michigan. She denied any recent illness or associated systemic symptoms. Initial evaluation by a primary care physician resulted in a diagnosis of a furuncle or tick bite; she completed a 10-day course of amoxicillin and a methylprednisolone dose pack without improvement.

Physical examination revealed a 1-cm, firm, violaceous nodule with a small distinct central punctum and surrounding erythema on the proximal aspect of the left arm. Dermoscopy revealed a pulsating motion and expulsion of serosanguineous fluid from the central punctum (Figure 1). Further inspection of the patient’s left arm exposed several noninflammatory puncta distal to the primary lesion spaced at 2- to 4-cm intervals.

FIGURE 1. Dermoscopy showed pulsating motion and expulsion of serosanguineous fluid from the central punctum with surrounding erythema.

Gross examination of a 6-mm punch biopsy from the primary inflammatory nodule uncovered a small, motile, gray-white larval organism in the inferior portion of the specimen (Figure 2). Histopathology revealed superficial and deep eosinophil-rich inflammation, fibrosis, and hemorrhage. There was a complex wedge-shaped organism with extensive internal muscle bounded by a thin cuticle bearing rows of chitinous hooklets located at one side within the deep dermis (Figure 3). The findings were consistent with a diagnosis of cutaneous myiasis. No further treatment was required, as the organism was completely excised with the biopsy.

FIGURE 2. A 6-mm punch biopsy revealed a small, motile, gray-white larval organism in the inferior portion of the specimen.

The most common causative agents of furuncular myiasis obtained from travelers returning from Mexico and Central and South America are Dermatobia hominis and Cordylobia anthropophaga. Cases of furuncular myiasis acquired in the United States without recent foreign travel are rare. Most of these cases are caused by larvae of the Cuterebra species (also known as the rabbit botfly or rodent botfly).2 In a 2003 literature review by Safdar et al3 on 56 cases of furuncular myiasis in the United States, the median age of patients was 14 years, 87% of cases occurred in August and September, and most involved exposure in rural or suburban settings; 53% of cases presented in the northeastern United States.

FIGURE 3. Histopathology revealed superficial and deep eosinophil-rich inflammation, fibrosis, and hemorrhage. A complex wedge-shaped organism with extensive internal skeletal muscle bounded by a thin cuticle bearing rows of chitinous hooklets was located in the deep dermis (H&E, original magnification ×40).

Furuncular myiasis occurs when the organism’s ova are deposited on the skin of a human host by the parent organism or a mosquito vector. The heat of the skin causes the eggs to hatch and the dipteran larvae must penetrate the skin within 20 days.1 Signs of infection typically are seen 6 to 10 days after infestation.3 The larvae then feed on human tissue and burrow deep in the dermis, forming an erythematous furunculoid nodule containing one or multiple maggots. After 5 to 10 weeks, the adult larvae drop to the ground, where they mature into adult organisms in the soil.1

The most reported symptoms of furuncular myiasis include pruritus, pain, and a sensation of movement, typically occurring suddenly at night.4 The most common presentation is a furunclelike lesion that exudes serosanguineous or purulent fluid,1 but there have been reports of vesicular, bullous, pustular, erosive, ecchymotic, and ulcerative lesions.5 Dermatobia hominis usually presents on an exposed site, such as the scalp, face, and extremities. It may present with paroxysmal episodes of lancinating pain. Over time, the lesion usually heals without a scar, though hyperpigmentation and scarring can occur. The most reported complication is secondary bacterial infection.4 Local lymphadenopathy or systemic symptoms should raise concern for infection. Staphylococcus aureus and group B Streptococcus have been cultured from lesions.6,7

The differential diagnosis for myiasis should include furuncle, insect bite, insect prurigo, pyoderma, inflamed cyst, and tungiasis. Myiasis also can present similarly to severe soft tissue infections or cellulitis. If located on the breasts, it can be mistaken for periductal mastitis, a benign mass with microcalcification, or inflammatory carcinoma. Lastly, due to pain, erythema, pruritus, small vesicles, and crusting, it may be confused with herpes simplex virus.1

Furuncular myiasis typically is diagnosed based on clinical presentation, especially in endemic regions. In nonendemic areas, the patient’s history may reveal recent travel or predisposition to myiasis. In cases where there is uncertainty, dermoscopy may be used to identify the maggot in the lesion, or ultrasonography can be used to confirm myiasis through the detection of larval movement.8 Dermoscopy will reveal a furuncular lesion with a central opening surrounded by dilated blood vessels and a yellowish structure with black barblike spines.9 Within the dermis is a fibrous cystic sinus tract containing the dipteran larva. Laboratory studies typically are unremarkable. In chronic cases, a complete blood cell count and other laboratory tests may show systemic inflammation, peripheral eosinophilia, and elevated IgE.10 Biopsies of furuncular myiasis are not necessary for diagnosis. Histopathology reveals an ulcerated epidermis with or without hyperkeratosis and an inflammatory infiltrate composed of lymphocytes and neutrophils with eosinophils, fibroblasts, histiocytes, basophils, mast cells, plasma cells, and Langerhans cells within the dermis and subcutis.11

There are various approaches to treating furuncular myiasis, with the goal of complete removal of the larva and prevention of secondary infection. One treatment option is to apply a toxic substance to the larva, effectively killing it. Another approach is to force the larva to emerge via localized hypoxia, which can be done by occluding the punctum of the lesion for at least 24 hours. A complication of this method is suffocation of the larva without migration, leading to incomplete extraction and secondary infection.1 A third method is to surgically remove the larva, which allows for debridement of necrotic tissue surrounding the lesion if present.12 Ultrasonography also can be used therapeutically to aid in the removal of the larvae. The last method is to inject lidocaine into the base of the lesion, forcing the larva out of the punctum via fluid pressure.13 Oral treatments such as ivermectin are not recommended because they can result in the death of larvae within the lesion, leading to an inflammatory response.8

Furuncular myiasis is a form of cutaneous larval infestation uncommon in individuals who do not live in, or travel to, endemic tropical and subtropical regions. Diagnosis is based on clinical presentation, with imaging and laboratory studies available to supplement unclear or atypical manifestations. Treatment involves complete removal of the larva, typically through forced evacuation via hypoxia or through surgical removal. Most cases resolve without notable scarring or other sequelae; when complications do occur, the most common is secondary bacterial infection. Our patient’s absence of notable travel history and her frequent environmental exposures in Michigan led us to believe the organism was from a domestic source. Our case underscores the importance of a thorough history and clinical examination of furuncular lesions, including the use of dermoscopy, to yield an appropriate diagnosis and treatment plan.

References
  1. Francesconi F, Lupi O. Myiasis. Clin Microbiol Rev. 2012;25:79-105. doi:10.1128/CMR.00010-11
  2. Schiff TA. Furuncular cutaneous myiasis caused by Cuterebra larva. J Am Acad Dermatol. 1993;28:261-263.
  3. Safdar N, Young DK, Andes D. Autochthonous furuncular myiasis in the United States: case report and literature review. Clin Infect Dis. 2003;26:73-80.
  4. Mahal JJ, Sperling JD. Furuncular myiasis from Dermatobia hominis: a case of human botfly infestation. J Emerg Med. 2012;43:618-621.
  5. Francesconi F, Lupi O. Myiasis. In: Tyring SK, Lupi O, Hengge UR, eds. Tropical Dermatology. Elsevier; 2006:232-239.
  6. Gordon PM, Hepburn NC, Williams AE, et al. Cutaneous myiasis due to Dermatobia hominis: a report of six cases. Br J Dermatol. 1995;132:811-814.
  7. Hubler WR Jr, Rudolph AH, Dougherty EF. Dermal myiasis. Arch Dermatol. 1974;110:109-110.
  8. Quintanilla-Cedillo MR, León-Ureña H, Contreras-Ruiz J, et al. The value of Doppler ultrasound in diagnosis in 25 cases of furunculoid myiasis. Int J Dermatol. 2005;44:34-37.
  9. Bakos RM, Bakos L. Dermoscopic diagnosis of furuncular myiasis. Arch Dermatol. 2007;143:123-124.
  10. Varani S, Tassinari D, Elleri D, et al. A case of furuncular myiasis associated with systemic inflammation. Parasitol Int. 2007;56:330-333.
  11. Grogan TM, Payne CM, Spier C, et al. Cutaneous myiasis: immunohistologic and ultrastructural morphometric features of a human botfly lesion. Am J Dermatopathol. 1987;9:232-239.
  12. Krajewski A, Allen B, Hoss D, et al. Cutaneous myiasis. J Plast Reconstr Aesthet Surg. 2009;62:383-386.
  13. Lebwohl MG, Heymann WR, Berth-Jones J, et al, eds. Myiasis. In: Treatment of Skin Disease: Comprehensive Therapeutic Strategies. 2nd ed. Elsevier-Mosby; 2006.
Author and Disclosure Information

Drs. Globerson, Yee, and Bender are from the Department of Dermatology, Beaumont Health Systems, Farmington Hills, Michigan. Dr. Olsen is from the Pinkus Dermatopathology Laboratory, Monroe, Michigan.

The authors report no conflict of interest.

Correspondence: Jeffrey Globerson, DO, 28050 Grand River Ave, Farmington Hills, MI 48336 ([email protected]).

Issue
Cutis - 110(2)
Page Number
E37-E39
Sections
Author and Disclosure Information

Drs. Globerson, Yee, and Bender are from the Department of Dermatology, Beaumont Health Systems, Farmington Hills, Michigan. Dr. Olsen is from the Pinkus Dermatopathology Laboratory, Monroe, Michigan.

The authors report no conflict of interest.

Correspondence: Jeffrey Globerson, DO, 28050 Grand River Ave, Farmington Hills, MI 48336 ([email protected]).

Author and Disclosure Information

Drs. Globerson, Yee, and Bender are from the Department of Dermatology, Beaumont Health Systems, Farmington Hills, Michigan. Dr. Olsen is from the Pinkus Dermatopathology Laboratory, Monroe, Michigan.

The authors report no conflict of interest.

Correspondence: Jeffrey Globerson, DO, 28050 Grand River Ave, Farmington Hills, MI 48336 ([email protected]).

Article PDF
Article PDF

To the Editor:

Cutaneous myiasis is a skin infestation with dipterous larvae that feed on the host’s tissue and cause a wide range of manifestations depending on the location of infestation. Cutaneous myiasis, which includes furuncular, wound, and migratory types, is the most common clinical form of this condition.1 It is endemic to tropical and subtropical areas and is not common in the United States, thus it can pose a diagnostic challenge when presenting in nonendemic areas. We present the case of a woman from Michigan who acquired furuncular myiasis without travel history to a tropical or subtropical locale.

A 72-year-old woman presented to our clinic with a chief concern of a burning, pruritic, migratory skin lesion on the left arm of approximately 1 week’s duration. She had a medical history of squamous cell carcinoma, keratoacanthoma, and multiple tick bites. She reported that the lesion started on the distal aspect of the left arm as an eraser-sized, perfectly round, raised bruise with a dark pepperlike bump in the center. The lesion then spread proximally over the course of 1 week, creating 3 more identical lesions. As one lesion resolved, a new lesion appeared approximately 2 to 4 cm proximal to the preceding lesion. The patient had traveled to England, Scotland, and Ireland 2 months prior but otherwise denied leaving the state of Michigan. She reported frequent exposure to gardens, meadows, and wetlands in search of milkweed and monarch butterfly larvae that she raises in northeast Michigan. She denied any recent illness or associated systemic symptoms. Initial evaluation by a primary care physician resulted in a diagnosis of a furuncle or tick bite; she completed a 10-day course of amoxicillin and a methylprednisolone dose pack without improvement.

Physical examination revealed a 1-cm, firm, violaceous nodule with a small distinct central punctum and surrounding erythema on the proximal aspect of the left arm. Dermoscopy revealed a pulsating motion and expulsion of serosanguineous fluid from the central punctum (Figure 1). Further inspection of the patient’s left arm exposed several noninflammatory puncta distal to the primary lesion spaced at 2- to 4-cm intervals.

Dermoscopy showed pulsating motion and expulsion of serosanguineous fluid from the central punctum with surrounding erythema.
FIGURE 1. Dermoscopy showed pulsating motion and expulsion of serosanguineous fluid from the central punctum with surrounding erythema.

Gross examination of a 6-mm punch biopsy from the primary inflammatory nodule uncovered a small, motile, gray-white larval organism in the inferior portion of the specimen (Figure 2). Histopathology revealed superficial and deep eosinophil-rich inflammation, fibrosis, and hemorrhage. There was a complex wedge-shaped organism with extensive internal muscle bounded by a thin cuticle bearing rows of chitinous hooklets located at one side within the deep dermis (Figure 3). The findings were consistent with a diagnosis of cutaneous myiasis. No further treatment was required, as the organism was completely excised with the biopsy.

A 6-mm punch biopsy revealed a small, motile, gray-white larval organism in the inferior portion of the specimen.
FIGURE 2. A 6-mm punch biopsy revealed a small, motile, gray-white larval organism in the inferior portion of the specimen.

The most common causative agents of furuncular myiasis obtained from travelers returning from Mexico and Central and South America are Dermatobia hominis and Cordylobia anthropophaga. Cases of furuncular myiasis acquired in the United States without recent foreign travel are rare. Most of these cases are caused by larvae of the Cuterebra species (also known as the rabbit botfly or rodent botfly).2 In a 2003 literature review by Safdar et al3 on 56 cases of furuncular myiasis in the United States, the median age of patients was 14 years, 87% of cases occurred in August and September, and most involved exposure in rural or suburban settings; 53% of cases presented in the northeastern United States.

Histopathology revealed superficial and deep eosinophilrich inflammation, fibrosis, and hemorrhage.
FIGURE 3. Histopathology revealed superficial and deep eosinophilrich inflammation, fibrosis, and hemorrhage. A complex wedgeshaped organism with extensive internal skeletal muscle bounded by a thin cuticle bearing rows of chitinous hooklets was located in the deep dermis (H&E, original magnification ×40).

Furuncular myiasis occurs when the organism’s ova are deposited on the skin of a human host by the parent organism or a mosquito vector. The heat of the skin causes the eggs to hatch and the dipteran larvae must penetrate the skin within 20 days.1 Signs of infection typically are seen 6 to 10 days after infestation.3 The larvae then feed on human tissue and burrow deep in the dermis, forming an erythematous furunculoid nodule containing one or multiple maggots. After 5 to 10 weeks, the adult larvae drop to the ground, where they mature into adult organisms in the soil.1

The most reported symptoms of furuncular myiasis include pruritus, pain, and movement sensation, typically occurring suddenly at night.4 The most common presentation is a furunclelike lesion that exudes serosanguineous or purulent fluid,1 but there have been reports of vesicular, bullous, pustular, erosive, ecchymotic, and ulcerative lesions.5Dermatobia hominis usually presents on an exposed site, such as the scalp, face, and extremities. It may present with paroxysmal episodes of lancinating pain. Over time, the lesion usually heals without a scar, though hyperpigmentation and scarring can occur. The most reported complication is secondary bacterial infection.4 Local lymphadenopathy or systemic symptoms should raise concern for infection. Staphylococcus aureus and group B Streptococcus have been cultured from lesions.6,7

 

 

The differential diagnosis for myiasis should include furuncle, insect bite, insect prurigo, pyoderma, inflamed cyst, and tungiasis. Myiasis also can present similarly to severe soft tissue infections or cellulitis. If located on the breasts, it can be mistaken for periductal mastitis, a benign mass with microcalcification, or inflammatory carcinoma. Lastly, due to pain, erythema, pruritus, small vesicles, and crusting, it may be confused for herpes simplex virus.1

Furuncular myiasis typically is diagnosed based on clinical presentation, especially in endemic regions. In nonendemic areas, the patient’s history may reveal recent travel or predisposition to myiasis. In cases where there is uncertainty, dermoscopy may be used to identify the maggot in the lesion, or ultrasonography can be used to confirm myiasis through the detection of larval movement.8 Dermoscopy will reveal a furuncular lesion with a central opening surrounded by dilated blood vessels and a yellowish structure with black barblike spines.9 Within the dermis is a fibrous cystic sinus tract containing the dipteran larva. Laboratory studies typically are unremarkable. In chronic cases, a complete blood cell count and other laboratory tests may show systemic inflammation, peripheral eosinophilia, and elevated IgE.10 Biopsies of furuncular myiasis are not necessary for diagnosis. Histopathology reveals an ulcerated epidermis with or without hyperkeratosis and an inflammatory infiltrate composed of lymphocytes and neutrophils with eosinophils, fibroblasts, histiocytes, basophils, mast cells, plasma cells, and Langerhans cells within the dermis and subcutis.11

There are various approaches to treating furuncular myiasis, with the goal of complete removal of the larva and prevention of secondary infection. One treatment option is to apply a toxic substance to the larva, effectively killing it. Another approach is to force the larva to emerge via localized hypoxia, which can be done by occluding the punctum of the lesion for at least 24 hours. A complication of this method is suffocation of the larva without migration, leading to incomplete extraction and secondary infection.1 A third method is to surgically remove the larva, which allows for debridement of necrotic tissue surrounding the lesion if present.12 Ultrasonography also can be used therapeutically to aid in the removal of the larvae. The last method is to inject lidocaine into the base of the lesion, forcing the larva out of the punctum via fluid pressure.13 Oral treatments such as ivermectin are not recommended because they can result in the death of larvae within the lesion, leading to an inflammatory response.8

Furuncular myiasis is a form of cutaneous larvae infestation not commonly seen in individuals who do not live or travel in endemic, tropical, and subtropical regions. Diagnosis is based on clinical presentation, with imaging and laboratory studies available to supplement in unclear or atypical manifestations. Treatment involves complete removal of the larva, typically through forced evacuation via hypoxia or through surgical removal. Most cases resolve without notable scarring or other sequelae; however, in those who do have complications, the most common is secondary bacterial infection. Our patient’s absence of notable travel history and frequent environmental exposure in Michigan led us to believe the organism was from a domestic source. Our case underlines the importance of a thorough history and clinical examination of furuncular lesions including the use of dermoscopy to yield an appropriate diagnosis and treatment plan.

To the Editor:

Cutaneous myiasis is a skin infestation with dipterous larvae that feed on the host’s tissue and cause a wide range of manifestations depending on the location of infestation. Cutaneous myiasis, which includes furuncular, wound, and migratory types, is the most common clinical form of this condition.1 It is endemic to tropical and subtropical areas and is not common in the United States, thus it can pose a diagnostic challenge when presenting in nonendemic areas. We present the case of a woman from Michigan who acquired furuncular myiasis without travel history to a tropical or subtropical locale.

A 72-year-old woman presented to our clinic with a chief concern of a burning, pruritic, migratory skin lesion on the left arm of approximately 1 week’s duration. She had a medical history of squamous cell carcinoma, keratoacanthoma, and multiple tick bites. She reported that the lesion started on the distal aspect of the left arm as an eraser-sized, perfectly round, raised bruise with a dark pepperlike bump in the center. The lesion then spread proximally over the course of 1 week, creating 3 more identical lesions. As one lesion resolved, a new lesion appeared approximately 2 to 4 cm proximal to the preceding lesion. The patient had traveled to England, Scotland, and Ireland 2 months prior but otherwise denied leaving the state of Michigan. She reported frequent exposure to gardens, meadows, and wetlands in search of milkweed and monarch butterfly larvae that she raises in northeast Michigan. She denied any recent illness or associated systemic symptoms. Initial evaluation by a primary care physician resulted in a diagnosis of a furuncle or tick bite; she completed a 10-day course of amoxicillin and a methylprednisolone dose pack without improvement.

Physical examination revealed a 1-cm, firm, violaceous nodule with a small distinct central punctum and surrounding erythema on the proximal aspect of the left arm. Dermoscopy revealed a pulsating motion and expulsion of serosanguineous fluid from the central punctum (Figure 1). Further inspection of the patient’s left arm exposed several noninflammatory puncta distal to the primary lesion spaced at 2- to 4-cm intervals.

Dermoscopy showed pulsating motion and expulsion of serosanguineous fluid from the central punctum with surrounding erythema.
FIGURE 1. Dermoscopy showed pulsating motion and expulsion of serosanguineous fluid from the central punctum with surrounding erythema.

Gross examination of a 6-mm punch biopsy from the primary inflammatory nodule uncovered a small, motile, gray-white larval organism in the inferior portion of the specimen (Figure 2). Histopathology revealed superficial and deep eosinophil-rich inflammation, fibrosis, and hemorrhage. There was a complex wedge-shaped organism with extensive internal muscle bounded by a thin cuticle bearing rows of chitinous hooklets located at one side within the deep dermis (Figure 3). The findings were consistent with a diagnosis of cutaneous myiasis. No further treatment was required, as the organism was completely excised with the biopsy.

A 6-mm punch biopsy revealed a small, motile, gray-white larval organism in the inferior portion of the specimen.
FIGURE 2. A 6-mm punch biopsy revealed a small, motile, gray-white larval organism in the inferior portion of the specimen.

The most common causative agents of furuncular myiasis obtained from travelers returning from Mexico and Central and South America are Dermatobia hominis and Cordylobia anthropophaga. Cases of furuncular myiasis acquired in the United States without recent foreign travel are rare. Most of these cases are caused by larvae of the Cuterebra species (also known as the rabbit botfly or rodent botfly).2 In a 2003 literature review by Safdar et al3 on 56 cases of furuncular myiasis in the United States, the median age of patients was 14 years, 87% of cases occurred in August and September, and most involved exposure in rural or suburban settings; 53% of cases presented in the northeastern United States.

FIGURE 3. Histopathology revealed superficial and deep eosinophil-rich inflammation, fibrosis, and hemorrhage. A complex wedge-shaped organism with extensive internal skeletal muscle bounded by a thin cuticle bearing rows of chitinous hooklets was located in the deep dermis (H&E, original magnification ×40).

Furuncular myiasis occurs when the organism’s ova are deposited on the skin of a human host by the parent organism or a mosquito vector. The heat of the skin causes the eggs to hatch and the dipteran larvae must penetrate the skin within 20 days.1 Signs of infection typically are seen 6 to 10 days after infestation.3 The larvae then feed on human tissue and burrow deep in the dermis, forming an erythematous furunculoid nodule containing one or multiple maggots. After 5 to 10 weeks, the adult larvae drop to the ground, where they mature into adult organisms in the soil.1

The most reported symptoms of furuncular myiasis include pruritus, pain, and movement sensation, typically occurring suddenly at night.4 The most common presentation is a furuncle-like lesion that exudes serosanguineous or purulent fluid,1 but there have been reports of vesicular, bullous, pustular, erosive, ecchymotic, and ulcerative lesions.5 Dermatobia hominis usually presents on an exposed site, such as the scalp, face, and extremities. It may present with paroxysmal episodes of lancinating pain. Over time, the lesion usually heals without a scar, though hyperpigmentation and scarring can occur. The most reported complication is secondary bacterial infection.4 Local lymphadenopathy or systemic symptoms should raise concern for infection. Staphylococcus aureus and group B Streptococcus have been cultured from lesions.6,7

The differential diagnosis for myiasis should include furuncle, insect bite, insect prurigo, pyoderma, inflamed cyst, and tungiasis. Myiasis also can present similarly to severe soft tissue infections or cellulitis. If located on the breasts, it can be mistaken for periductal mastitis, a benign mass with microcalcification, or inflammatory carcinoma. Lastly, due to pain, erythema, pruritus, small vesicles, and crusting, it may be confused for herpes simplex virus.1

Furuncular myiasis typically is diagnosed based on clinical presentation, especially in endemic regions. In nonendemic areas, the patient’s history may reveal recent travel or predisposition to myiasis. In cases where there is uncertainty, dermoscopy may be used to identify the maggot in the lesion, or ultrasonography can be used to confirm myiasis through the detection of larval movement.8 Dermoscopy will reveal a furuncular lesion with a central opening surrounded by dilated blood vessels and a yellowish structure with black barblike spines.9 Within the dermis is a fibrous cystic sinus tract containing the dipteran larva. Laboratory studies typically are unremarkable. In chronic cases, a complete blood cell count and other laboratory tests may show systemic inflammation, peripheral eosinophilia, and elevated IgE.10 Biopsies of furuncular myiasis are not necessary for diagnosis. Histopathology reveals an ulcerated epidermis with or without hyperkeratosis and an inflammatory infiltrate composed of lymphocytes and neutrophils with eosinophils, fibroblasts, histiocytes, basophils, mast cells, plasma cells, and Langerhans cells within the dermis and subcutis.11

There are various approaches to treating furuncular myiasis, with the goal of complete removal of the larva and prevention of secondary infection. One treatment option is to apply a toxic substance to the larva, effectively killing it. Another approach is to force the larva to emerge via localized hypoxia, which can be done by occluding the punctum of the lesion for at least 24 hours. A complication of this method is suffocation of the larva without migration, leading to incomplete extraction and secondary infection.1 A third method is to surgically remove the larva, which allows for debridement of necrotic tissue surrounding the lesion if present.12 Ultrasonography also can be used therapeutically to aid in the removal of the larvae. The last method is to inject lidocaine into the base of the lesion, forcing the larva out of the punctum via fluid pressure.13 Oral treatments such as ivermectin are not recommended because they can result in the death of larvae within the lesion, leading to an inflammatory response.8

Furuncular myiasis is a form of cutaneous larval infestation not commonly seen in individuals who do not live in or travel to endemic tropical and subtropical regions. Diagnosis is based on clinical presentation, with imaging and laboratory studies available to supplement unclear or atypical manifestations. Treatment involves complete removal of the larva, typically through forced evacuation via hypoxia or through surgical removal. Most cases resolve without notable scarring or other sequelae; in those who do have complications, the most common is secondary bacterial infection. Our patient's lack of notable travel history and frequent environmental exposure in Michigan led us to believe the organism was from a domestic source. Our case underlines the importance of a thorough history and clinical examination of furuncular lesions, including the use of dermoscopy, to yield an appropriate diagnosis and treatment plan.

References
  1. Francesconi F, Lupi O. Myiasis. Clin Microbiol Rev. 2012;25:79-105. doi:10.1128/CMR.00010-11
  2. Schiff TA. Furuncular cutaneous myiasis caused by Cuterebra larva. J Am Acad Dermatol. 1993;28:261-263.
  3. Safdar N, Young DK, Andes D. Autochthonous furuncular myiasis in the United States: case report and literature review. Clin Infect Dis. 2003;26:73-80.
  4. Mahal JJ, Sperling JD. Furuncular myiasis from Dermatobia hominus: a case of human botfly infestation. J Emerg Med. 2012;43:618-621.
  5. Francesconi F, Lupi O. Myiasis. In: Tyring SK, Lupi O, Hengge UR, eds. Tropical Dermatology. Elsevier; 2006:232-239.
  6. Gordon PM, Hepburn NC, Williams AE, et al. Cutaneous myiasis due to Dermatobia hominis: a report of six cases. Br J Dermatol. 1995;132:811-814.
  7. Hubler WR Jr, Rudolph AH, Dougherty EF. Dermal myiasis. Arch Dermatol. 1974;110:109-110.
  8. Quintanilla-Cedillo MR, León-Ureña H, Contreras-Ruiz J, et al. The value of Doppler ultrasound in diagnosis in 25 cases of furunculoid myiasis. Int J Dermatol. 2005;44:34-37.
  9. Bakos RM, Bakos L. Dermoscopic diagnosis of furuncular myiasis. Arch Dermatol. 2007;143:123-124.
  10. Varani S, Tassinari D, Elleri D, et al. A case of furuncular myiasis associated with systemic inflammation. Parasitol Int. 2007;56:330-333.
  11. Grogan TM, Payne CM, Spier C, et al. Cutaneous myiasis: immunohistologic and ultrastructural morphometric features of a human botfly lesion. Am J Dermatopathol. 1987;9:232-239.
  12. Krajewski A, Allen B, Hoss D, et al. Cutaneous myiasis. J Plast Reconstr Aesthet Surg. 2009;62:383-386.
  13. Lebwohl MG, Heymann WR, Berth-Jones J, et al. Myiasis: Treatment of Skin Diseases. Comprehensive Therapeutic Strategies. 2nd ed. Elsevier-Mosby; 2006.
Issue
Cutis - 110(2)
Page Number
E37-E39
Punked By the Punctum: Domestically Acquired Cutaneous Myiasis

Practice Points

  • Cutaneous myiasis is a skin infestation with dipterous larvae that feed on the host’s tissue and cause a wide range of manifestations depending on the location of infestation. It consists of 3 types: furuncular, wound, and migratory forms.
  • It is uncommon in the United States and not typically seen in patients who have no history of recent travel to tropical or subtropical areas.
  • The most common cause of furuncular myiasis acquired in the United States is larvae of the Cuterebra species (also known as the rabbit botfly or rodent botfly).