Music at bedtime may aid depression-related insomnia

Listening to music via curated playlists at bedtime is effective for depression-related insomnia, although the depression itself is unaffected, new research suggests.

The Music to Improve Sleep Quality in Adults With Depression and Insomnia (MUSTAFI) trial randomly assigned 112 outpatients with depression to either a music intervention or a waiting list. Sleep quality and quality of life significantly improved after listening to music for half an hour at bedtime for 4 weeks.

“This is a low-cost, safe intervention that has no side effects and may easily be implemented in psychiatry” along with existing treatments, lead researcher Helle Nystrup Lund, PhD, unit for depression, Aalborg (Denmark) University Hospital, said in an interview.

The findings were presented at the European Psychiatric Association 2023 Congress and recently published in the Nordic Journal of Psychiatry.

Difficult to resolve

The researchers noted that insomnia is common in patients with depression and is “difficult to resolve.”

They added that, while music is commonly used as a sleep aid and a growing evidence base suggests it has positive effects, few studies have examined its effectiveness in patients with depression-related insomnia.

To fill this research gap, 112 outpatients with depression and comorbid insomnia who were receiving care at a single center were randomly assigned to either an intervention group or a wait list control group.

Participants in the intervention group listened to music for a minimum of 30 minutes at bedtime for 4 weeks. The music was delivered via the MusicStar app, which is available as a free download from the Apple and Android (Google Play) app stores. The app was developed by Dr. Lund and Lars Rye Bertelsen, a PhD student and music therapist at Aalborg University Hospital.

The app is designed as a multicolored star, with each arm of the star linking to a playlist lasting between 30 minutes and 1 hour. Each color of the star indicates a different tempo of music.

Blue playlists, Dr. Lund explained, offer the quietest music; green is more lively, and red is the most dynamic. Gray playlists link to soundtracks produced for the project, such as summer rain.

Dr. Lund said organizing the playlists by stimulation level and color code, rather than by genre, allows users to regulate their level of arousal and makes choosing music intuitive and easy.

She said that the genres of music include New Age, folk, pop, classical, and film soundtracks, “but no hard rock.”

“There’s actually a quite large selection of music available, because studies show that individual choice is important, as are personal preferences,” she said, adding that the endless choices offered by streaming services can cause confusion.

“So we made curated playlists and designed them with well-known pieces, but also with newly composed music not associated with anything,” Dr. Lund said.

Participants were assessed using the Pittsburgh Sleep Quality Index (PSQI), the 17-item Hamilton Depression Rating Scale (HAMD-17), and two World Health Organization well-being questionnaires (WHO-5, WHOQOL-BREF), as well as actigraphy.

Results showed that, at 4 weeks, participants in the intervention group experienced significant improvements in sleep quality in comparison with control persons. The effect size for the PSQI was –2.1, and for quality of life on the WHO-5, the effect size was 8.4.
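
(For reference, the PSQI global score ranges from 0 to 21, with higher scores indicating worse sleep, and the WHO-5 is scaled from 0 to 100, with higher scores indicating better well-being, so both changes point in the direction of improvement.)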

A subanalysis revealed that the length of nocturnal sleep in the intervention group increased by an average of 18 minutes during the study from a baseline of approximately 5 hours per night, said Dr. Lund.

However, there were no changes in actigraphy measurements and no significant improvements in HAMD-17 scores.

Dr. Lund said that, on the basis of these positive findings, music intervention as a sleep aid is now offered at Aalborg University Hospital to patients with depression-related insomnia.

Clinically meaningful?

Commenting on the findings, Gerald J. Haeffel, PhD, department of psychology, University of Notre Dame, South Bend, Ind., said that, overall, the study showed a change in sleep quality and quality-of-life scores of “about 10% in each.”

“This, on the surface, would seem to be a meaningful change,” although it is less clear whether it is “clinically meaningful.” Perhaps it is, “but it would be nice to have more information.”

It would be useful, he added, to “show the means for each group pre- to postintervention, along with standard deviations.”

Dr. Haeffel added that on the basis of current results, it isn’t possible to determine whether individuals’ control over music choice is important.

“We have no idea if ‘choice’ or length of playlist had any causal role in the results. One would need to run a study with the same playlist, but in one group people have to listen to whatever song comes on versus another condition in which they get to choose a song off the same list,” he said.

He noted that his group conducted a study in which highly popular music that was chosen by individual participants was found to have a positive effect. Even so, he said, “we could not determine if it was ‘choice’ or ‘popularity’ that caused the positive effects of music.”

In addition, he said, the reason music has a positive effect on insomnia remains unclear.

“It is not because it helped with depression, and it’s not because it’s actually changing objective sleep parameters. It could be that it improves mood right before bed or helps distract people right before bed. At the same time, it could also just be a placebo effect,” said Dr. Haeffel.

In addition, he said, it’s important to note that the music intervention had no comparator, so “maybe just doing something different or getting to talk with researchers created the effect and has nothing to do with music.”

Overall, he believes that there are “not enough data” to use the sleep intervention that was employed in the current study “as primary intervention, but future work could show its usefulness as a supplement.”

Dr. Lund and Mr. Bertelsen reported ownership and sales of the MusicStar app. Dr. Haeffel reported no relevant financial relationships.

Cancer risk elevated after stroke in younger people

Younger people who experience ischemic stroke or intracerebral hemorrhage have about a three- to fivefold increased risk of being diagnosed with cancer in the next few years, new research shows.

In young people, stroke might be the first manifestation of an underlying cancer, according to the investigators, led by Jamie Verhoeven, MD, PhD, with the department of neurology, Radboud University Medical Centre, Nijmegen, the Netherlands.

The new study can be viewed as a “stepping stone for future studies investigating the usefulness of screening for cancer after stroke,” the researchers say.

The study was published online in JAMA Network Open.

Currently, the diagnostic workup for young people with stroke includes searching for rare clotting disorders, although screening for cancer is not regularly performed.

Some research suggests that stroke and cancer are linked, but the literature is limited. In prior studies among people of all ages, cancer incidence after stroke has been variable – from 1% to 5% at 1 year and from 11% to 30% after 10 years.

To the team’s knowledge, only two studies have described the incidence of cancer after stroke among younger patients. One put the risk at 0.5% for people aged 18-50 years in the first year after stroke; the other described a cumulative risk of 17.3% in the 10 years after stroke for patients aged 18-55 years.

Using Dutch data, Dr. Verhoeven and colleagues identified 27,616 young stroke patients (age, 15-49 years; median age, 45 years) and 362,782 older stroke patients (median age, 76 years).

The cumulative incidence of any new cancer at 10 years was 3.7% among the younger stroke patients and 8.5% among the older stroke patients.

The incidence of a new cancer after stroke among younger patients was higher among women than men, while the opposite was true for older stroke patients.

Compared with the general population, younger stroke patients had a more than 2.5-fold greater likelihood of being diagnosed with a new cancer in the first year after ischemic stroke (standardized incidence ratio, 2.6). The risk was highest for lung cancer (SIR, 6.9), followed by hematologic cancers (SIR, 5.2).
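
(The standardized incidence ratio is the number of cancers observed in the stroke cohort divided by the number expected on the basis of general-population rates; an SIR of 2.6 therefore corresponds to 2.6 times the expected number of diagnoses.)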

Compared with the general population, younger stroke patients had nearly a 5.5-fold greater likelihood of being diagnosed with a new cancer in the first year after intracerebral hemorrhage (SIR, 5.4), and the risk was highest for hematologic cancers (SIR, 14.2).

In younger patients, the cumulative incidence of any cancer decreased over the years but remained significantly higher for 8 years following a stroke.

For patients aged 50 years or older, the risk for any new cancer in the first year after either ischemic stroke or intracerebral hemorrhage was 1.2 times that of the general population.

“We typically think of occult cancer as being a cause of stroke in an older population, given that the incidence of cancer increases over time [but] what this study shows is that we probably do need to consider occult cancer as an underlying cause of stroke even in a younger population,” said Laura Gioia, MD, stroke neurologist at the University of Montreal, who was not involved in the research.

Dr. Verhoeven and colleagues conclude that their finding supports the hypothesis of a causal link between cancer and stroke. Given the timing between stroke and cancer diagnosis, cancer may have been present when the stroke occurred and possibly played a role in causing it, the authors note. However, conclusions on causal mechanisms cannot be drawn from the current study.

The question of whether young stroke patients should be screened for cancer is a tough one, Dr. Gioia noted. “Cancer represents a small percentage of causes of stroke. That means you would have to screen a lot of people with a benefit that is still uncertain for the moment,” Dr. Gioia said in an interview.

“I think we need to keep cancer in mind as a cause of stroke in our young patients, and that should probably guide our history-taking with the patient and consider imaging when it’s appropriate and when we think that there could be an underlying occult cancer,” Dr. Gioia suggested.

The study was funded in part through unrestricted funding by Stryker, Medtronic, and Cerenovus. Dr. Verhoeven and Dr. Gioia have disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

SGLT2 inhibitors: Real-world data show benefits outweigh risks

A new study provides the first comprehensive safety profile of sodium-glucose cotransporter 2 (SGLT2) inhibitors in U.S. patients with chronic kidney disease (CKD) and type 2 diabetes receiving routine care and suggests that the benefits outweigh the risks.

Starting therapy with an SGLT2 inhibitor versus a glucagon-like peptide-1 (GLP-1) receptor agonist was associated with more lower limb amputations, nonvertebral fractures, and genital infections, but these risks need to be balanced against cardiovascular and renoprotective benefits, according to the researchers.

The analysis showed that there would be 2.1 more lower limb amputations, 2.5 more nonvertebral fractures, and 41 more genital infections per 1,000 patients per year among those receiving SGLT2 inhibitors versus an equal number of patients receiving GLP-1 agonists, lead author Edouard Fu, PhD, explained to this news organization in an email.
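
Expressed as rough numbers needed to harm, those differences correspond to about one additional amputation for every 476 patients treated for a year (1,000 ÷ 2.1), one additional fracture for every 400 (1,000 ÷ 2.5), and one additional genital infection for every 24 (1,000 ÷ 41).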

“On the other hand, we know from the evidence from randomized controlled trials that taking an SGLT2 inhibitor compared with placebo lowers the risk of developing kidney failure,” said Dr. Fu, who is a research fellow in the division of pharmacoepidemiology and pharmacoeconomics at Brigham and Women’s Hospital, Boston.

“For instance,” he continued, “in the DAPA-CKD clinical trial, dapagliflozin versus placebo led to 29 fewer events per 1,000 patients per year of the composite outcome (50% decline in estimated glomerular filtration rate [eGFR], kidney failure, cardiovascular or kidney death).”

In the CREDENCE trial, canagliflozin versus placebo led to 18 fewer events per 1,000 person-years for the composite outcome of doubling of serum creatinine, kidney failure, and cardiovascular or kidney death.

And in the EMPA-KIDNEY study, empagliflozin versus placebo led to 21 fewer events per 1,000 person-years for the composite outcome of progression of kidney disease or cardiovascular death.

“Thus, benefits would still outweigh the risks,” Dr. Fu emphasized.

‘Quantifies absolute rate of events among routine care patients’

“The importance of our paper,” he summarized, “is that it quantifies the absolute rate of events among routine care patients and may be used to inform shared decision-making.”

The analysis also found that the risks of diabetic ketoacidosis (DKA), hypovolemia, hypoglycemia, and severe urinary tract infection (UTI) were similar with SGLT2 inhibitors versus GLP-1 agonists, but the risk of developing acute kidney injury (AKI) was lower with an SGLT2 inhibitor.

“Our study can help inform patient-physician decision-making regarding risks and benefits before prescribing SGLT2 inhibitors in this population” of patients with CKD and diabetes treated in clinical practice, the researchers conclude, “but needs to be interpreted in light of its limitations, including residual confounding, short follow-up time, and the use of diagnosis codes to identify patients with CKD.”

The study was recently published in the Clinical Journal of the American Society of Nephrology.

Slow uptake, safety concerns

SGLT2 inhibitors are recommended as first-line therapy in patients with type 2 diabetes and CKD who have an eGFR equal to or greater than 20 mL/min per 1.73 m², and thus are at high risk for cardiovascular disease and kidney disease progression, Dr. Fu and colleagues write.

However, studies report that as few as 6% of patients with CKD and type 2 diabetes are currently prescribed SGLT2 inhibitors in the United States.

This slow uptake of SGLT2 inhibitors among patients with CKD may be partly due to concerns about DKA, fractures, amputations, and urogenital infections observed in clinical trials.

However, such trials are generally underpowered to assess rare adverse events, use monitoring protocols to lower the risk of adverse events, and include a highly selected patient population, and so safety in routine clinical practice is often unclear.

To examine this, the researchers identified health insurance claims data from 96,128 individuals (from Optum, IBM MarketScan, and Medicare databases) who were 18 years or older (65 years or older for Medicare) and had type 2 diabetes and at least one inpatient or two outpatient diagnostic codes for stage 3 or 4 CKD.

Of these patients, 32,192 had a newly filled prescription for an SGLT2 inhibitor (empagliflozin, dapagliflozin, canagliflozin, or ertugliflozin) and 63,936 had a newly filled prescription for a GLP-1 agonist (liraglutide, dulaglutide, semaglutide, exenatide, albiglutide, or lixisenatide) between April 2013, when the first SGLT2 inhibitor was available in the United States, and 2021.

The researchers matched 28,847 individuals who were initiated on an SGLT2 inhibitor with an equal number who were initiated on a GLP-1 agonist, based on propensity scores, adjusting for more than 120 baseline characteristics.

Safety outcomes were based on previously identified potential safety signals.

Patients who were initiated on an SGLT2 inhibitor had 1.30-fold, 2.13-fold, and 3.08-fold higher risks of having a nonvertebral fracture, a lower limb amputation, and a genital infection, respectively, compared with patients who were initiated on a GLP-1 agonist, after a mean on-treatment time of 7.5 months.

Risks of DKA, hypovolemia, hypoglycemia, and severe UTI were similar in both groups.

Patients initiated on an SGLT2 inhibitor versus a GLP-1 agonist had a lower risk of AKI (hazard ratio, 0.93), equivalent to 6.75 fewer cases of AKI per 1,000 patients per year.
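
Taken at face value, those two figures imply a background AKI rate of roughly 96 events per 1,000 patient-years among GLP-1 agonist initiators, since a 7% relative reduction of that rate accounts for about 6.75 fewer events (0.07 × 96 ≈ 6.75).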

Patients had higher risks for lower limb amputation, genital infections, and nonvertebral fractures with SGLT2 inhibitors versus GLP-1 agonists across most of the prespecified subgroups by age, sex, cardiovascular disease, heart failure, and use of metformin, insulin, or sulfonylurea, but with wider confidence intervals.

Dr. Fu was supported by a Rubicon grant from the Dutch Research Council and has reported no relevant financial relationships. Disclosures for the other authors are listed with the article.

A version of this article originally appeared on Medscape.com.

When practice-changing results don’t change practice

The highly favorable results of the CheckMate 816 trial of neoadjuvant chemotherapy plus nivolumab for resectable stage IB-IIIA non–small cell lung cancer (NSCLC) were impressive enough to prompt a Food and Drug Administration approval of this combination in March 2022.

For many, this led to a marked shift in how we approached these patients. But in my conversations with many care teams, they have expressed ambivalence about using the chemoimmunotherapy regimen. Some have conveyed to me that the lack of statistically significant improvement in overall survival is a sticking point. Others have expressed uncertainty about the true benefit of neoadjuvant chemotherapy alongside nivolumab for patients with earlier-stage disease, given that 64% of patients in the trial had stage IIIA disease. The benefit of the neoadjuvant combination in patients with low or negative tumor programmed death–ligand 1 (PD-L1) expression also remains a question mark, though the trial found no significant differences in outcomes by PD-L1 subset.

But among many of my colleagues who favor adjuvant over neoadjuvant therapy, it isn’t necessarily the fine points of the data that present the real barrier: it’s the sentiment that “we just don’t favor a neoadjuvant approach at my place.”

If the worry is that a subset of patients who are eligible for up-front surgery may be derailed from the operating room by significant disease progression or a complication during preoperative therapy, or that surgery will be more difficult after chemoimmunotherapy, those concerns are not supported by evidence. In fact, data on surgical outcomes from CheckMate 816 assessing these issues found that surgery after chemoimmunotherapy was approximately 30 minutes faster than it was after chemotherapy alone. In addition, the combination neoadjuvant chemoimmunotherapy approach was associated with less extensive surgeries, particularly for patients with stage IIIA NSCLC, and patients experienced measurably lower reports of pain and dyspnea as well.

Though postoperative systemic therapy has been our general approach for resectable NSCLC for nearly 2 decades, there are several reasons to focus on neoadjuvant therapy.

First, immunotherapy may work more effectively when the tumor antigens, along with the lymph nodes and lymphatic system, are still present in situ.

Second, patients may be eager to complete their treatment within a 3-month period of just three cycles of systemic therapy followed by surgery rather than receiving their treatment over a prolonged chapter of their lives, starting with surgery followed by four cycles of chemotherapy and 1 year of immunotherapy. 

Finally, we can’t ignore the fact that most neoadjuvant therapy is delivered exactly as intended, whereas planned adjuvant therapy is often not started and is rarely completed as designed. At most, only about half of appropriate patients for adjuvant chemotherapy even start it, and far fewer complete a full four cycles or go on to complete prolonged adjuvant immunotherapy.

We also can’t underestimate the value of imaging and pathology findings after patients have completed neoadjuvant therapy. The pathologic complete response rate in CheckMate 816 is predictive of improved event-free survival over time.

And that isn’t just a binary variable of achieving a pathologic complete response or not. The degree of residual, viable tumor after surgery is a continuous variable associated along a spectrum with event-free survival. Our colleagues who treat breast cancer have been able to customize postoperative therapy to improve outcomes on the basis of the results achieved with neoadjuvant therapy. Multidisciplinary gastrointestinal oncology teams have revolutionized outcomes with rectal cancer by transitioning to total neoadjuvant therapy that makes it possible to deliver treatment more reliably and pursue organ-sparing approaches while achieving better survival.

Putting all of this together, I appreciate arguments against the generalizability or the maturity of the data supporting neoadjuvant chemoimmunotherapy for resectable NSCLC. However, sidestepping our most promising advances will harm our patients. Plus, what’s the point of generating practice-changing results if we don’t accept and implement them?

We owe it to our patients to follow the evolving evidence and not just stick to what we’ve always done.

Dr. West is an associate professor at City of Hope Comprehensive Cancer Center in Duarte, Calif., and vice president of network strategy at AccessHope in Los Angeles. Dr. West serves as web editor for JAMA Oncology, edits and writes several sections on lung cancer for UpToDate, and leads a wide range of continuing medical education and other educational programs.

A version of this article first appeared on Medscape.com.


Heart rate, cardiac phase influence perception of time

Article Type
Changed
Fri, 04/07/2023 - 08:14

 

People’s perception of time is subjective and based not only on their emotional state but also on heartbeat and heart rate (HR), two new studies suggest.

Researchers used electrocardiography (ECG) to record cardiac electrical activity at millisecond resolution in young adults while they listened to tones that varied in duration. Participants were asked to report whether certain tones were longer or shorter in relation to others.

The researchers found that the momentary perception of time was not continuous but rather expanded or contracted with each heartbeat. When the heartbeat preceding a tone was shorter, participants regarded the tone as longer in duration; but when the preceding heartbeat was longer, the participants experienced the tone as shorter.

“Our findings suggest that there is a unique role that cardiac dynamics play in the momentary experience of time,” lead author Saeedah Sadeghi, MSc, a doctoral candidate in the department of psychology at Cornell University, Ithaca, N.Y., said in an interview.

The study was published online in Psychophysiology.

In a second study, published in the journal Current Biology, a separate team of researchers asked participants to judge whether a brief event – the presentation of a tone or an image – was shorter or longer than a reference duration. ECG was used to track systole and diastole when participants were presented with these events.

The researchers found that the durations were underestimated during systole and overestimated during diastole, suggesting that time seemed to “speed up” or “slow down,” based on cardiac contraction and relaxation. When participants rated the events as more arousing, their perceived durations contracted, even during diastole.

“In our new paper, we show that our heart shapes the perceived duration of events, so time passes quicker when the heart contracts but slower when the heart relaxes,” lead author Irena Arslanova, PhD, postdoctoral researcher in cognitive neuroscience, Royal Holloway University of London, told this news organization.
 

Temporal ‘wrinkles’

“Subjective time is malleable,” observed Ms. Sadeghi and colleagues in their report. “Rather than being a uniform dimension, perceived duration has ‘wrinkles,’ with certain intervals appearing to dilate or contract relative to objective time” – a phenomenon sometimes referred to as “distortion.”

“We have known that people aren’t always consistent in how they perceive time, and objective duration doesn’t always explain subjective perception of time,” Ms. Sadeghi said.

Although the potential role of the heart in the experience of time has been hypothesized, research into the heart-time connection has been limited, with previous studies focusing primarily on average cardiac measures over longer time scales of seconds to minutes.

The current study sought to investigate “the beat-by-beat fluctuations of the heart period on the experience of brief moments in time” because, compared with longer time scales, subsecond temporal perception “has different underlying mechanisms” and a subsecond stimulus can be a “small fraction of a heartbeat.”

To home in on this small fraction, the researchers studied 45 participants (aged 18-21), who listened to 210 tones ranging in duration from 80 ms (short) to 188 ms (long). The tones were linearly spaced at 18-ms increments (80, 98, 116, 134, 152, 170, 188).

Participants were asked to categorize each tone as “short” or “long.” All tones were randomly assigned to be synchronized either with the systolic or diastolic phase of the cardiac cycle (50% each). The tones were triggered by participants’ heartbeats.
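As a concrete illustration of this design, the trial list can be reconstructed in a few lines of Python. This is a sketch, not the authors’ code; the balanced layout of 15 repeats per duration-by-phase cell is an assumption that simply matches the reported totals (7 durations x 2 cardiac phases x 15 = 210 tones).

    import random

    # Illustrative sketch of the reported design (not the study's code):
    # seven tone durations from 80 to 188 ms in 18-ms steps, each paired
    # with the systolic or diastolic cardiac phase, 15 repeats per cell.
    durations_ms = list(range(80, 189, 18))   # [80, 98, 116, 134, 152, 170, 188]
    trials = [(d, phase)
              for d in durations_ms
              for phase in ("systole", "diastole")
              for _ in range(15)]             # 7 x 2 x 15 = 210 trials
    random.shuffle(trials)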

In addition, participants engaged in a heartbeat-counting activity, in which they were asked not to touch their pulse but to count their heartbeats by tuning in to their bodily sensations at intervals of 25, 35, and 45 seconds.
 

‘Classical’ response

“Participants exhibited an increased heart period after tone onset, which returned to baseline following an average canonical bell shape,” the authors reported.

The researchers performed regression analyses to determine how, on average, the heart rate before the tone was related to perceived duration, and how the amount of heart rate change after the tone was related to perceived duration.

They found that when the heart rate was higher before the tone, participants tended to be more accurate in their time perception. When the heartbeat preceding a tone was shorter, participants experienced the tone as longer; conversely, when the heartbeat was longer, they experienced the duration of the identical sound as shorter.
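For readers who want to see what such an analysis looks like, here is a minimal Python sketch on synthetic data. It is not the study’s dataset or exact model; the 850-ms average heart period, the noise level, and the simulated effect size are all arbitrary assumptions chosen only to mirror the direction of the reported relationship.

    import numpy as np
    import statsmodels.api as sm

    # Synthetic stand-in for one participant: probability of judging a
    # tone "long" as a function of the heart period (ms) preceding it.
    rng = np.random.default_rng(0)
    heart_period_ms = rng.normal(850, 60, size=210)
    p_long = 1 / (1 + np.exp(0.01 * (heart_period_ms - 850)))  # assumed effect
    judged_long = rng.binomial(1, p_long)

    model = sm.Logit(judged_long, sm.add_constant(heart_period_ms)).fit(disp=0)
    print(model.params)

A negative coefficient on the heart-period term corresponds to the pattern reported here: the shorter the preceding heartbeat, the more likely a tone is judged long.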

When participants focused their attention on the sounds, their orienting responses changed their heart rate and, in turn, their temporal perception.

“The orienting response is classical,” Ms. Sadeghi said. “When you attend to something unpredictable or novel, the act of orienting attention decreases the HR.”

She explained that the heartbeats are “noise to the brain.” When people need to perceive external events, “a decrease in HR facilitates the intake of things from outside and facilitates sensory intake.”

A lower HR “makes it easier for the person to take in the tone and perceive it, so it feels as though they perceive more of the tone and the duration seems longer.”

It is unknown whether this is a causal relationship, she cautioned, “but it seems as though the decrease in HR somehow makes it easier to ‘get’ more of the tone, which then appears to have longer duration.”
 

Bidirectional relationship

“We know that experienced time can be distorted,” said Dr. Arslanova. “Time flies by when we’re busy or having fun but drags on when we’re bored or waiting for something, yet we still don’t know how the brain gives rise to such elastic experience of time.”

The brain controls the heart in response to the information the heart provides about the state of the body, she noted, “but we have begun to see more research showing that the heart–brain relationship is bidirectional.”

This means that the heart plays a role in shaping “how we process information and experience emotions.” In this analysis, Dr. Arslanova and colleagues “wanted to study whether the heart also shapes the experience of time.”

To do so, they conducted two experiments.

In the first, participants (n = 28) were presented with brief events during systole or during diastole. The events took the form of an emotionally neutral visual shape or auditory tone, presented for durations of 200 to 400 ms.

Participants were asked whether these events were of longer or shorter duration, compared with a reference duration.

The researchers found a significant main effect of cardiac phase (F(1,27) = 8.1, P = .01), with stimuli presented at diastole judged, on average, as 7 ms longer than those presented at systole.

They also found a significant main effect of modality (F(1,27) = 5.7, P = .02), with tones judged, on average, as 13 ms longer than visual stimuli.

“This means that time ‘sped up’ during the heart’s contraction and ‘slowed down’ during the heart’s relaxation,” Dr. Arslanova said.

The effect of cardiac phase on duration perception was independent of changes in HR, the authors noted.

In the second experiment, participants performed a similar task, but this time it involved images of faces with emotional expressions. The researchers again observed time appearing to speed up during systole and slow down during diastole, with stimuli presented at diastole judged, on average, 9 ms longer than those presented at systole.

These opposing effects of systole and diastole on time perception were present only for low and average arousal ratings (b = 14.4 [SE 3.2], P < .001, and b = 9.2 [SE 2.3], P < .001, respectively). However, the effect disappeared when arousal ratings increased (b = 4.1 [SE 3.2], P = .21).

“Interestingly, when participants rated the events as more arousing, their perceived durations contracted, even during the heart’s relaxation,” Dr. Arslanova observed. “This means that in a nonaroused state, the two cardiac phases pull the experienced duration in opposite directions – time contracts, then expands.”

The findings “also predict that increasing HR would speed up passing time, making events seem shorter, because there will be a stronger influence from the heart’s contractions,” she said.

She described the relationship between time perception and emotion as complex, noting that the findings are important because they show “that the way we experience time cannot be examined in isolation from our body.”
 

Converging evidence

Martin Wiener, PhD, assistant professor, George Mason University, Fairfax, Va., said both papers “provide converging evidence on the role of the heart in our perception of time.”

Together, “the results share that our sense of time – that is, our incoming sensory perception of the present ‘moment’ – is adjusted or ‘gated’ by both our HR and cardiac phase,” said Dr. Wiener, executive director of the Timing Research Forum.

The studies “provide a link between the body and the brain, in terms of our perception, and that we cannot study one without the context of the other,” said Dr. Wiener, who was not involved with the current study.

“All of this opens up a new avenue of research, and so it is very exciting to see,” Dr. Wiener stated.

No source of funding was listed for the study by Ms. Sadeghi and coauthors. They declared no relevant financial relationships.

Dr. Arslanova and coauthors declared no competing interests. Senior author Manos Tsakiris, PhD, receives funding from the European Research Council Consolidator Grant. Dr. Wiener declared no relevant financial relationships.
 

A version of this article first appeared on Medscape.com.


Nasal COVID treatment shows early promise against multiple variants

Article Type
Changed
Wed, 04/05/2023 - 11:38

An antiviral therapy in early development has the potential to prevent COVID-19 infections when given as a nasal spray as little as 4 hours before exposure. It also appears to work as a treatment if used within 4 hours after infection inside the nose, new research reveals. 

Known as TriSb92 (brand name Covidin, from drugmaker Pandemblock Oy in Finland), the viral inhibitor also appears effective against all coronavirus variants of concern, neutralizing even the Omicron variants BA.5, XBB, and BQ.1.1 in laboratory and mouse studies. 

Unlike a COVID vaccine that boosts a person’s immune system as protection, the antiviral nasal spray works more directly by blocking the virus, acting as a “biological mask in the nasal cavity,” according to the biotechnology company set up to develop the treatment. 

The product targets a stable site on the spike protein of the virus that is not known to mutate. This same site is shared among many variants of the COVID virus, so it could be effective against future variants as well, researchers note.

“In animal models, by directly inactivating the virus, TriSb92 offers immediate and robust protection” against coronavirus infection and severe COVID, said Anna R. Mäkelä, PhD, lead author of the study and a senior scientist in the department of virology at the University of Helsinki. 

The study was published online in Nature Communications.
 

A potential first line of defense

Even in cases where the antiviral does not prevent coronavirus infection, the treatment could slow infection. This could happen by limiting how much virus could replicate early in the skin inside the nose and nasopharynx (the upper part of the throat), said Dr. Mäkelä, who is also CEO of Pandemblock Oy, the company set up to develop the product.

“TriSb92 could effectively tip the balance in favor of [the person] and thereby help to reduce the risk of severe COVID-19 disease,” she said. 

The antiviral also could offer an alternative to people who cannot or do not respond to a vaccine.

“Many elderly people as well as individuals who are immunodeficient for various reasons do not respond to vaccines and are in the need of other protective measures,” said Kalle Saksela, MD, PhD, senior author of the study and a virologist at the University of Helsinki.
 

Multiple doses needed? 

TriSb92 is “one of multiple nasal spray approaches but unlikely to be as durable as effective nasal vaccines,” said Eric Topol, MD, a professor of molecular medicine and executive vice president of Scripps Research in La Jolla, Calif. Dr. Topol is also editor-in-chief of Medscape, WebMD’s sister site for medical professionals.

“The sprays generally require multiple doses per day, whereas a single dose of a nasal vaccine may protect for months,” he said.

“Both have the allure of being variant-proof,” Dr. Topol added. 
 

Thinking small

Many laboratories are shifting from treatments using monoclonal antibodies to treatments using smaller antibody fragments called “nanobodies” because they are more cost-effective and are able to last longer in storage, Dr. Mäkelä and colleagues noted. 

Several of these nanobodies have shown promise against viruses in cell culture or animal models, including as an intranasal preventive treatment for SARS-CoV-2. 

One of these smaller antibodies is being developed from llamas, for example; another comes from experiments with yeast to develop synthetic nanobodies; and in a third case, researchers isolated nanobodies from llamas and from mice and showed that they could neutralize the SARS-CoV-2 virus.

These nanobodies and TriSb92 target a specific part of the coronavirus spike protein called the receptor-binding domain (RBD). The RBD is where the coronavirus attaches to cells in the body. These agents essentially trick the virus by changing the structure of the outside of cells, so they look like a virus has already fused to them. This way, the virus moves on. 
 

Key findings

The researchers compared mice treated with TriSb92 before and after exposure to SARS-CoV-2. When given in advance, none of the treated mice had SARS-CoV-2 RNA in their lungs, while untreated mice in the comparison group had “abundant” levels.

Other evidence of viral infection showed similar differences between treated and untreated mice in the protective lining of cells called the epithelium inside the nose, nasal mucosa, and airways. 

Similarly, when given 2 or 4 hours after SARS-CoV-2 had already infected the epithelium, TriSb92 was linked to a complete lack of the virus’s RNA in the lungs.

It was more effective against the virus, though, when given before infection rather than after, “perhaps due to the initial establishment of the infection,” the researchers note.

The company led by Dr. Mäkelä is now working to secure funding for clinical trials of TriSb92 in humans. 

A version of this article first appeared on WebMD.com.


Cluster, migraine headache strongly linked to circadian rhythm

Article Type
Changed
Mon, 04/03/2023 - 14:18

 

Cluster headache and migraine have strong ties to the circadian system at multiple levels, say new findings that could have significant treatment implications.

A meta-analysis of 16 studies showed a circadian pattern in 71% of cluster headache attacks (3,490 of 4,953), with a clear circadian peak between 9:00 p.m. and 3:00 a.m.

Migraine was also associated with a circadian pattern in 50% of cases (2,698 of 5,385) across eight studies, with a clear circadian trough between 11:00 p.m. and 7:00 a.m.

Seasonal peaks were also evident for cluster headache (spring and autumn) and migraine (April to October).
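Because the reported cluster headache peak spans midnight, tallying attacks in that window requires a wrap-around comparison on the 24-hour clock. The short Python sketch below illustrates the counting logic with synthetic onset times; it is not the meta-analysis code, and the uniform distribution of onsets is an arbitrary assumption used only to show the arithmetic.

    import random

    # Synthetic onset hours on a 0-24 clock (not study data).
    onsets = [random.uniform(0, 24) for _ in range(1000)]

    # The 9 p.m.-3 a.m. window wraps past midnight, so membership is
    # "hour >= 21 OR hour < 3" rather than a single range check.
    in_peak = [h for h in onsets if h >= 21 or h < 3]
    print(f"{len(in_peak) / len(onsets):.0%} fell between 21:00 and 03:00")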

“In the short term, these findings help us explain the timing to patients – for example, it is possible that a headache at 8 a.m. is due to their internal body clock instead of their pillow, or breakfast food, or morning medications,” lead investigator Mark Burish, MD, PhD, associate professor, department of neurosurgery, at University of Texas Health Houston, told this news organization.

“In the long term, these findings do suggest that medications that target the circadian system could be effective in migraine and headache patients,” Dr. Burish added.

The study was published online in Neurology.


 

Treatment implications?

Across studies, chronotype was “highly variable” for both cluster headache and migraine, the investigators report.

Cluster headache was associated with lower melatonin and higher cortisol levels, compared with non–cluster headache controls.

On a genetic level, cluster headache was associated with two core circadian genes (CLOCK and REV-ERB–alpha), and five of the nine genes that increase the likelihood of having cluster headache are genes with a circadian pattern of expression.

Migraine was associated with lower urinary melatonin levels and with two core circadian genes, CK1-delta and ROR-alpha; 110 of the 168 genes associated with migraine were clock-controlled genes.

“The data suggest that both of these headache disorders are highly circadian at multiple levels, especially cluster headache,” Dr. Burish said in a release.

“This reinforces the importance of the hypothalamus – the area of the brain that houses the primary biological clock – and its role in cluster headache and migraine. It also raises the question of the genetics of triggers such as sleep changes that are known triggers for migraine and are cues for the body’s circadian rhythm,” Dr. Burish said.

“We hope that future research will look into circadian medications as a new treatment option for migraine and cluster headache patients,” Dr. Burish told this news organization.

Importance of sleep regulation

The authors of an accompanying editorial note that even though the study doesn’t have immediate clinical implications, it offers a better understanding of the way chronobiologic factors may influence treatment.

“At a minimum, interventions known to regulate and improve sleep (e.g., melatonin, cognitive behavioral therapy), and which are safe and straightforward to introduce, may be useful in some individuals susceptible to circadian misalignment or sleep disorders,” write Heidi Sutherland, PhD, and Lyn Griffiths, PhD, with Queensland University of Technology, Brisbane, Australia.

“Treatment of comorbidities (e.g., insomnia) that result in sleep disturbances may also help headache management. Furthermore, chronobiological aspects of any pharmacological interventions should be considered, as some frequently used headache and migraine drugs can modulate circadian cycles and influence the expression of circadian genes (e.g., verapamil), or have sleep-related side effects,” they add.

A limitation of the study was the lack of information on factors that could influence the circadian cycle, such as medications; other disorders, such as bipolar disorder; or circadian rhythm issues, such as night-shift work.

The study was supported by grants from the Japan Society for the Promotion of Science, the National Institutes of Health, The Welch Foundation, and The Will Erwin Headache Research Foundation. Dr. Burish is an unpaid member of the medical advisory board of Clusterbusters, and a site investigator for a cluster headache clinical trial funded by Lundbeck. Dr. Sutherland has received grant funding from the U.S. Migraine Research Foundation, and received institute support from Queensland University of Technology for genetics research. Dr. Griffiths has received grant funding from the Australian NHMRC, U.S. Department of Defense, and the U.S. Migraine Research Foundation, and consultancy funding from TEVA.

A version of this article first appeared on Medscape.com.

FROM NEUROLOGY

Kickback Scheme Nets Prison Time for Philadelphia VAMC Service Chief

Article Type
Changed
Mon, 04/03/2023 - 12:36

A former manager at the Philadelphia Veterans Affairs Medical Center (VAMC) has been sentenced to 6 months in federal prison for his part in a bribery scheme.

Ralph Johnson was convicted of accepting $30,000 in kickbacks and bribes for steering contracts to Earron and Carlicha Starks, who ran Ekno Medical Supply and Collondale Medical Supply from 2009 to 2019. Johnson served as chief of environmental services at the medical center. He admitted to receiving cash in binders and packages mailed to his home between 2018 and 2019.

The Starkses first pleaded guilty to paying kickbacks on $7 million worth of contracts with Florida VA facilities, then participated in a sting that implicated Johnson.

The VA Office of Inspector General began investigating Johnson in 2018 after the Starkses, who were indicted for bribing staff at US Department of Veterans Affairs (VA) hospitals in Miami and West Palm Beach, Florida, said they also paid officials in VA facilities on the East Coast.

According to the Philadelphia Inquirer, the judge credited Johnson’s past military service and his “extensive cooperation” with federal authorities investigating fraud within the VA. Johnson apologized to his former employers: “Throughout these 2 and a half years [since the arrest] there’s not a day I don’t think about the wrongness that I did.”

In addition to the prison sentence, Johnson has been ordered to pay back, at $50 a month, the $440,000-plus he cost the Philadelphia VAMC in fraudulent and bloated contracts.

Johnson is at least the third Philadelphia VAMC employee indicted or sentenced for fraud since 2020.

Song stuck in your head? What earworms reveal about health

Article Type
Changed
Wed, 04/05/2023 - 13:57

 

If Miley Cyrus has planted “Flowers” in your head, rest assured you’re not alone.

An earworm – a bit of music you can’t shake from your brain – happens to almost everyone. 

The culprit is typically a song you’ve heard repeatedly with a strong rhythm and melody (like Miley’s No. 1 hit this year).

It pops into your head and stays there, unbidden and often unwanted. As you fish for something new on Spotify, there’s always a chance that a catchy hook holds an earworm.

“A catchy tune or melody is the part of a song most likely to get stuck in a person’s head, often a bit from the chorus,” said Elizabeth H. Margulis, PhD, a professor at Princeton (N.J.) University and director of its music cognition lab. The phenomenon, which has been studied since 1885 (way before earbuds), goes by such names as stuck song syndrome, sticky music, musical imagery repetition, intrusive musical imagery, or the semi-official term, involuntary musical imagery, or INMI.

Research confirms how common it is. A 2020 study of American college students found that 97% had experienced an earworm in the past month, similar to findings from a larger Finnish survey done more than 10 years ago.

One in five people had experienced an earworm more than once a day, the study found. The typical length was 10-30 minutes, though 8.5% said theirs lasted more than 3 hours. Levels of “distress and interference” caused by earworms were mostly “mild to moderate.”

Some 86% said they tried to stop it – most frequently by distraction, like talking to a friend or listening to another song.

If music is important to you, your earworms are more likely to last longer and be harder to control, earlier research found. And women are thought to be more likely to have them.

“Very musical people may have more earworms because it’s easy for them to conjure up a certain tune,” said David Silbersweig, MD, chairman of the department of psychiatry and codirector of the Institute for the Neurosciences at Brigham and Women’s Hospital in Boston.

Moreover, people who lack “psychological flexibility” may find earworms more bothersome. The more they try to avoid or control intrusive thoughts (or songs), the more persistent those thoughts become. 

“This is consistent with OCD (obsessive-compulsive disorder) research on the paradoxical effect of thought suppression,” the authors of the 2020 study wrote. In fact, people who report very annoying or stressful earworms are more likely to have obsessive-compulsive symptoms.

Earworms have been linked to other medical conditions as well, and even harmless earworms can be intrusive and time-consuming. That makes them worth a closer look.

Digging for the source of earworms

Scientists trace earworms to the auditory cortex in the temporal lobe of the brain, which controls how you perceive music, as well as deep temporal lobe areas that are responsible for retrieving memories. Your amygdala and ventral striatum, parts of your brain that involve emotion, also tie into the making of an earworm.

MRI experiments found that “INMI is a common internal experience recruiting brain networks involved in perception, emotions, memory and spontaneous thoughts,” a 2015 paper in Consciousness and Cognition reported.

These brain networks work in tandem if you connect a song to an emotional memory – that’s when you’re more likely to experience it as an earworm. The “loop” of music you’ll hear in your head is usually a 20-second snippet.

Think of it as a “cognitive itch,” as researchers from the Netherlands put it. An earworm can be triggered by associating a song with a specific situation or emotion. Trying to suppress it just reminds you it’s there, “scratching” the itch and making it worse. “The more one tries to suppress the songs, the more their impetus increases, a mental process known as ironic process theory,” they wrote. 

“It’s also worth pointing out that earworms don’t always occur right after a song ends,” said Michael K. Scullin, PhD, an associate professor of psychology and neuroscience at Baylor University in Waco, Tex. “Sometimes they only occur many hours later, and sometimes the earworm isn’t the song you were most recently listening to.”

These processes aren’t fully understood, he said, “but they likely represent memory consolidation mechanisms; that is, the brain trying to reactivate and stabilize musical memories.” Kind of like switching “radio stations” in your head.

When to worry

Earworms are most often harmless. “They’re part of a healthy brain,” said Dr. Silbersweig. But in rare cases, they indicate certain medical conditions. People with OCD, for example, have been shown to have earworms during times of stress. If this is the case, cognitive-behavioral therapy as well as some antidepressants may help.

Take an earworm seriously if it’s linked to other symptoms, said Elaine Jones, MD, a neurologist in Hilton Head, S.C., and a fellow of the American Academy of Neurology. Those symptoms could include “loss of consciousness or confusion, visual loss or changes, speech arrest, tremors of arms or legs,” she said.

“Most worrisome would be a seizure, but other causes could include a migraine aura. In a younger person, less than 20 years old, this kind of earworm could indicate a psychiatric condition like schizophrenia.” Drug toxicity or brain damage can also present with earworms.

Her bottom line: “If an earworm is persistent for more than 24 hours, or if it is associated with the other symptoms mentioned above, it would be important to reach out to your primary care doctor to ensure that nothing more serious is going on,” said Dr. Jones. With no other symptoms, “it is more likely to be just an earworm.”

Japanese research also indicates that an earworm that lasts for several hours in a day can be linked to depression. If a person has symptoms such as low mood, insomnia, and loss of appetite, along with earworms that last several hours a day, treatment may help. 

There’s another category called “musical hallucinations,” where the person thinks they are actually hearing music; this could be a symptom of depression, although scientists don’t know for sure. The drug vortioxetine, which may help boost serotonin in the brain, has shown some promise in reducing earworms.

Some research has shown that diseases that damage the auditory pathway in the brain have a link to musical hallucinations. 

How to stop a simple earworm

Here are six easy ways to make it stop:

  • Mix up your playlist. “Listening to songs repeatedly does increase the likelihood that they’ll get stuck,” said Dr. Margulis. 
  • Take breaks from your tunes throughout the day. “Longer listening durations are more likely to lead to earworms,” Dr. Scullin said.
  • Use your feet. Walk at a pace that’s faster or slower than the beat of your earworm. This will interrupt your memory of the tempo and can help chase away the earworm.
  • Stick with that song. “Listen to a song all the way through,” said Dr. Silbersweig. If you only listen to snippets of a song, the Zeigarnik effect can take hold. That’s the brain’s tendency to remember things that are interrupted more easily than completed things.
  • Distract yourself. Lose yourself in a book, a movie, your work, or a hobby that requires concentration. “Redirecting attention to an absorbing task can be an effective way to dislodge an earworm,” said Dr. Margulis. 
  • Chew gum. Research shows that the action of chewing interferes with repetitive memories and stops your mind from “scanning” a song. Then enjoy the sound of silence!

A version of this article first appeared on WebMD.com.

Analysis identifies gaps in CV risk screening of patients with psoriasis

Article Type
Changed
Wed, 04/05/2023 - 11:54

 

Just 16% of psoriasis-related visits to dermatology providers in the United States involve screening for cardiovascular (CV) risk factors, with screening lowest in the region with the highest CV disease burden, according to an analysis of 10 years of national survey data.

From 2007 to 2016, national screening rates for four CV risk factors at 14.8 million psoriasis-related visits to dermatology providers were 11% (body mass index), 7.4% (blood pressure), 2.9% (cholesterol), and 1.7% (glucose). Data from the National Ambulatory Medical Care Survey showed that at least one of the four factors was screened for at 16% of dermatology visits, said William B. Song, BS, of the department of dermatology, University of Pennsylvania, Philadelphia, and associates.

The main focus of their study, however, was regional differences. “CV risk factor screening by dermatology providers for patients with psoriasis is low across all regions of the United States and lowest in the South, the region that experiences the highest CVD burden in the United States,” they wrote in a letter to the editor.

Compared with the South, the adjusted odds of any CV screening were 0.98 in the West, 1.25 in the Northeast, and 1.92 in the Midwest. Blood pressure screening was significantly higher in all three regions, compared with the South, while BMI screening was actually lower in the West (0.74), the investigators reported. Odds ratios were not available for cholesterol and glucose screening because of sample size limitations.



The regional variation in screening rates “is not explained by patient demographics or disease severity,” they noted, adding that 2.8 million visits with BP screening would have been added over the 10-year study period “if providers in the South screened patients with psoriasis for high blood pressure at the same rate as providers in the Northeast.”

Guidelines published in 2019 by the American Academy of Dermatology and the National Psoriasis Foundation – which were cowritten by Joel M. Gelfand, MD, senior author of the current study – noted that dermatologists “play an important role in evidence-based screening of CV risk factors in patients with psoriasis,” the investigators wrote. But the regional variations suggest “that some regions experience barriers to appropriate screening or challenges in adhering to guidelines for managing psoriasis and CV risk.”

While the lack of data after 2016 is one of the study’s limitations, they added, “continued efforts to develop effective interventions to improve CV screening and care for people with psoriasis in all regions of the U.S. are needed to more effectively address the burden of CV disease experienced by people with psoriasis.”

The study was partly funded by the National Psoriasis Foundation. Three of the seven investigators disclosed earnings from private companies in the form of consultant fees, research support, and honoraria. Dr. Gelfand is a deputy editor for the Journal of Investigative Dermatology.

FROM THE JOURNAL OF INVESTIGATIVE DERMATOLOGY